Dec 01 09:14:41 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 01 09:14:41 crc restorecon[4654]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:41 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 
09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 01 09:14:42 crc 
restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 
09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 01 09:14:42 crc restorecon[4654]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 01 09:14:42 crc kubenswrapper[4763]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 09:14:42 crc kubenswrapper[4763]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 01 09:14:42 crc kubenswrapper[4763]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 09:14:42 crc kubenswrapper[4763]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 01 09:14:42 crc kubenswrapper[4763]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 01 09:14:42 crc kubenswrapper[4763]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.838896 4763 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843379 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843410 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843420 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843428 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843437 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843446 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843478 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843486 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843494 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843502 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843513 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843523 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843532 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843540 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843548 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843557 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843572 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843580 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843589 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843597 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843609 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843620 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843630 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843641 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843650 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843659 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843669 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843677 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843686 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843696 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843705 4763 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843714 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843722 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843730 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843737 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843745 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843753 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843763 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843770 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843778 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843786 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843793 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843801 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843809 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843817 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843824 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843832 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843840 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843847 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843855 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843862 4763 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843870 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843881 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843889 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843897 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843905 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843912 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843921 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843928 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843935 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843943 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843953 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843962 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843969 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843976 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843984 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.843992 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.844001 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.844009 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.844019 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.844029 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844195 4763 flags.go:64] FLAG: --address="0.0.0.0"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844211 4763 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844225 4763 flags.go:64] FLAG: --anonymous-auth="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844236 4763 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844247 4763 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844256 4763 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844267 4763 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844278 4763 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844287 4763 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844296 4763 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844306 4763 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844315 4763 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844324 4763 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844335 4763 flags.go:64] FLAG: --cgroup-root=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844345 4763 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844354 4763 flags.go:64] FLAG: --client-ca-file=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844363 4763 flags.go:64] FLAG: --cloud-config=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844373 4763 flags.go:64] FLAG: --cloud-provider=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844382 4763 flags.go:64] FLAG: --cluster-dns="[]"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844393 4763 flags.go:64] FLAG: --cluster-domain=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844401 4763 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844411 4763 flags.go:64] FLAG: --config-dir=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844420 4763 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844430 4763 flags.go:64] FLAG: --container-log-max-files="5"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844441 4763 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844450 4763 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844487 4763 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844497 4763 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844506 4763 flags.go:64] FLAG: --contention-profiling="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844515 4763 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844524 4763 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844533 4763 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844542 4763 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844553 4763 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844563 4763 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844572 4763 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844580 4763 flags.go:64] FLAG: --enable-load-reader="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844589 4763 flags.go:64] FLAG: --enable-server="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844598 4763 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844609 4763 flags.go:64] FLAG: --event-burst="100"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844618 4763 flags.go:64] FLAG: --event-qps="50"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844627 4763 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844635 4763 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844644 4763 flags.go:64] FLAG: --eviction-hard=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844656 4763 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844665 4763 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844674 4763 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844683 4763 flags.go:64] FLAG: --eviction-soft=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844692 4763 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844701 4763 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844710 4763 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844718 4763 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844727 4763 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844737 4763 flags.go:64] FLAG: --fail-swap-on="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844746 4763 flags.go:64] FLAG: --feature-gates=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844757 4763 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844766 4763 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844775 4763 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844784 4763 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844793 4763 flags.go:64] FLAG: --healthz-port="10248"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844803 4763 flags.go:64] FLAG: --help="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844812 4763 flags.go:64] FLAG: --hostname-override=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844820 4763 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844829 4763 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844838 4763 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844847 4763 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844856 4763 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844864 4763 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844873 4763 flags.go:64] FLAG: --image-service-endpoint=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844882 4763 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844891 4763 flags.go:64] FLAG: --kube-api-burst="100"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844900 4763 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844909 4763 flags.go:64] FLAG: --kube-api-qps="50"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844917 4763 flags.go:64] FLAG: --kube-reserved=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844926 4763 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844935 4763 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844944 4763 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844953 4763 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844965 4763 flags.go:64] FLAG: --lock-file=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844974 4763 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844984 4763 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.844993 4763 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845007 4763 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845016 4763 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845025 4763 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845034 4763 flags.go:64] FLAG: --logging-format="text"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845042 4763 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845052 4763 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845061 4763 flags.go:64] FLAG: --manifest-url=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845070 4763 flags.go:64] FLAG: --manifest-url-header=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845082 4763 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845091 4763 flags.go:64] FLAG: --max-open-files="1000000"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845102 4763 flags.go:64] FLAG: --max-pods="110"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845111 4763 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845120 4763 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845128 4763 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845138 4763 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845147 4763 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845156 4763 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845165 4763 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845183 4763 flags.go:64] FLAG: --node-status-max-images="50"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845192 4763 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845201 4763 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845210 4763 flags.go:64] FLAG: --pod-cidr=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845218 4763 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845231 4763 flags.go:64] FLAG: --pod-manifest-path=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845240 4763 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845249 4763 flags.go:64] FLAG: --pods-per-core="0"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845258 4763 flags.go:64] FLAG: --port="10250"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845292 4763 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845302 4763 flags.go:64] FLAG: --provider-id=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845315 4763 flags.go:64] FLAG: --qos-reserved=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845326 4763 flags.go:64] FLAG: --read-only-port="10255"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845337 4763 flags.go:64] FLAG: --register-node="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845348 4763 flags.go:64] FLAG: --register-schedulable="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845357 4763 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845372 4763 flags.go:64] FLAG: --registry-burst="10"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845381 4763 flags.go:64] FLAG: --registry-qps="5"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845390 4763 flags.go:64] FLAG: --reserved-cpus=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845398 4763 flags.go:64] FLAG: --reserved-memory=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845409 4763 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845418 4763 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845427 4763 flags.go:64] FLAG: --rotate-certificates="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845436 4763 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845445 4763 flags.go:64] FLAG: --runonce="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845480 4763 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845490 4763 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845500 4763 flags.go:64] FLAG: --seccomp-default="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845509 4763 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845518 4763 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845527 4763 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845536 4763 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845545 4763 flags.go:64] FLAG: --storage-driver-password="root"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845554 4763 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845563 4763 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845573 4763 flags.go:64] FLAG: --storage-driver-user="root"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845582 4763 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845592 4763 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845602 4763 flags.go:64] FLAG: --system-cgroups=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845610 4763 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845624 4763 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845632 4763 flags.go:64] FLAG: --tls-cert-file=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845641 4763 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845653 4763 flags.go:64] FLAG: --tls-min-version=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845662 4763 flags.go:64] FLAG: --tls-private-key-file=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845671 4763 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845680 4763 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845688 4763 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845697 4763 flags.go:64] FLAG: --v="2"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845708 4763 flags.go:64] FLAG: --version="false"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845719 4763 flags.go:64] FLAG: --vmodule=""
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845729 4763 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.845738 4763 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.845949 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
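The long run of flags.go:64 entries above is the kubelet logging every registered flag and its effective value at startup (it runs here at verbosity --v=2), including untouched defaults such as --pod-cidr="". A minimal sketch of the same pattern with github.com/spf13/pflag, the flag library Kubernetes components build on; the two flags registered here are stand-ins, not the kubelet's real flag set.

package main

import (
	"log"

	"github.com/spf13/pflag"
)

func main() {
	fs := pflag.NewFlagSet("kubelet-sketch", pflag.ExitOnError)
	// Two stand-in flags; the real kubelet registers hundreds.
	fs.String("node-ip", "", "IP address of the node")
	fs.Int32("max-pods", 110, "maximum number of pods per node")
	if err := fs.Parse([]string{"--node-ip=192.168.126.11"}); err != nil {
		log.Fatal(err)
	}

	// VisitAll walks every registered flag in lexical order, set or not,
	// which is why the dump above also shows untouched defaults.
	fs.VisitAll(func(f *pflag.Flag) {
		log.Printf("FLAG: --%s=%q", f.Name, f.Value.String())
	})
}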
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.845961 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.845971 4763 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.845979 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.845987 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.845996 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846003 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846011 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846020 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846028 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846036 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846044 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846052 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846059 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846068 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846075 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846083 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846091 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846099 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846107 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846114 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846123 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846131 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846138 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846146 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846154 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846163 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846171 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846179 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846187 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846195 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846203 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846211 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846218 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846226 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846233 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846241 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846248 4763 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846256 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846263 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846274 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846284 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846293 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846301 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846310 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846319 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846327 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846336 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846344 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846351 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846359 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846367 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846375 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846382 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846393 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846402 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846410 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846418 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846427 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846434 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846442 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846449 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846481 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846490 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846498 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846509 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846519 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846527 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846535 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846543 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.846550 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.846831 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.859408 4763 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.859492 4763 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859593 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859606 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859611 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859617 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859622 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859631 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859637 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859643 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859647 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859652 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859656 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859661 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859665 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859671 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859675 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859680 4763 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859684 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859689 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859693 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859698 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859702 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859707 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859711 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859716 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859721 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859725 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859730 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859738 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859746 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859752 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859758 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859766 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859772 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859777 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859783 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859788 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859793 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859799 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859805 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859810 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859814 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859817 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859822 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859826 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859830 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859835 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859840 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859844 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859849 4763 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859854 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859858 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859863 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859870 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859875 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859880 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859885 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859891 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859898 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859904 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859911 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
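The repeated feature_gate.go:330 warnings are noisy but harmless: the wrapper hands the kubelet a gate list that includes OpenShift operator-level gates (GatewayAPI, NewOLM, MachineConfigNodes, and so on) that the kubelet's upstream registry does not know, and the parser warns and drops unknown names instead of failing startup, while known-but-GA gates (ValidatingAdmissionPolicy, CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders) and deprecated ones (KMSv1) draw their own warnings. The whole block repeats because the gates are re-applied several times during startup. A minimal sketch of that tolerant parsing, with a tiny stand-in registry rather than the real k8s.io/component-base implementation:

package main

import "fmt"

type gateState int

const (
	beta gateState = iota
	ga
	deprecated
)

// known is a tiny stand-in registry; the kubelet's real table lives in
// k8s.io/component-base/featuregate and is far larger.
var known = map[string]gateState{
	"ValidatingAdmissionPolicy": ga,
	"KMSv1":                     deprecated,
	"NodeSwap":                  beta,
}

// apply mirrors the tolerant behaviour in the log: unrecognized names are
// warned about and skipped rather than failing kubelet startup.
func apply(requested map[string]bool) map[string]bool {
	enabled := make(map[string]bool)
	for name, val := range requested {
		state, ok := known[name]
		if !ok {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		switch state {
		case ga:
			fmt.Printf("W Setting GA feature gate %s=%t. It will be removed in a future release.\n", name, val)
		case deprecated:
			fmt.Printf("W Setting deprecated feature gate %s=%t. It will be removed in a future release.\n", name, val)
		}
		enabled[name] = val
	}
	return enabled
}

func main() {
	apply(map[string]bool{
		"GatewayAPI":                true, // OpenShift-level gate: warned, dropped
		"ValidatingAdmissionPolicy": true, // GA: warned, kept
		"KMSv1":                     true, // deprecated: warned, kept
	})
}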
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859916 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859924 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859929 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859934 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859939 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859943 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859948 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859953 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859958 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859962 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.859966 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.859974 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860149 4763 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860160 4763 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860165 4763 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860170 4763 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860175 4763 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860179 4763 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860184 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860189 4763 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860193 4763 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860197 4763 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860202 4763 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860206 4763 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860212 4763 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860220 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860225 4763 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860230 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860234 4763 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860239 4763 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860243 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860248 4763 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860252 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860256 4763 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860261 4763 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860265 4763 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860269 4763 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860274 4763 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860278 4763 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860284 4763 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860289 4763 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860294 4763 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860299 4763 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860303 4763 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860307 4763 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860312 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860316 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860320 4763 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860324 4763 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860330 4763 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860336 4763 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860342 4763 feature_gate.go:330] unrecognized feature gate: Example
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860347 4763 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860352 4763 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860357 4763 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860361 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860365 4763 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860370 4763 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860375 4763 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860379 4763 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860383 4763 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860388 4763 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860392 4763 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860395 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860399 4763 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860402 4763 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860406 4763 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860409 4763 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860414 4763 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860423 4763 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860427 4763 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860431 4763 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860434 4763 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860438 4763 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860441 4763 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860444 4763 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860448 4763 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860469 4763 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860473 4763 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860477 4763 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860481 4763 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860484 4763 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.860490 4763 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.860498 4763 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.860704 4763 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.864052 4763 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.864171 4763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.865304 4763 server.go:997] "Starting client certificate rotation" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.865331 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.865497 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-09 13:35:58.855300372 +0000 UTC Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.865615 4763 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 940h21m15.989691414s for next certificate rotation Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.871043 4763 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.874603 4763 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.884848 4763 log.go:25] "Validated CRI v1 runtime API" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.897953 4763 log.go:25] "Validated CRI v1 image API" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.900448 4763 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.904022 4763 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-01-09-08-44-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.904082 4763 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.920007 4763 manager.go:217] Machine: {Timestamp:2025-12-01 09:14:42.918815671 +0000 UTC m=+0.187464479 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0f5eae23-6db1-423b-9ba3-36ae34520ea2 BootID:c0bd43ec-2730-494c-91aa-feba284cbe79 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 
DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:69:ca:24 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:69:ca:24 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:24:f5:f5 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:47:84:da Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:90:be:13 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:32:62:28 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:1d:0d:75 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:02:b5:49:9e:9f:ea Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:8e:ad:ba:2e:e2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.920261 4763 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
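
[Note] The Machine record above is cAdvisor's hardware inventory of the CRC VM: MemoryCapacity:25199480832 bytes is about 23.5 GiB (25199480832 / 2^30 ≈ 23.47), and NumCores:8 alongside NumSockets:8 is QEMU presenting eight single-core sockets. A rough sketch of where two of those fields come from, assuming (as cAdvisor does on Linux) that memory is read from /proc/meminfo and the core count from the online CPUs — this is not the actual cAdvisor code path:

```go
// Sketch: derive NumCores and MemoryCapacity the way a cAdvisor-like
// collector would on Linux. Illustrative only.
package main

import (
	"bufio"
	"fmt"
	"os"
	"runtime"
	"strconv"
	"strings"
)

// memTotalBytes parses the "MemTotal: <n> kB" line from /proc/meminfo.
func memTotalBytes() (uint64, error) {
	f, err := os.Open("/proc/meminfo")
	if err != nil {
		return 0, err
	}
	defer f.Close()
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		fields := strings.Fields(sc.Text())
		if len(fields) >= 2 && fields[0] == "MemTotal:" {
			kb, err := strconv.ParseUint(fields[1], 10, 64)
			if err != nil {
				return 0, err
			}
			return kb * 1024, nil
		}
	}
	return 0, fmt.Errorf("MemTotal not found in /proc/meminfo")
}

func main() {
	mem, err := memTotalBytes()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// On the CRC VM above this prints roughly MemoryCapacity:25199480832.
	fmt.Printf("NumCores: %d, MemoryCapacity: %d bytes\n", runtime.NumCPU(), mem)
}
```
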
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.920536 4763 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.921085 4763 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.921359 4763 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.921402 4763 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.921686 4763 topology_manager.go:138] "Creating topology manager with none policy" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.921702 4763 container_manager_linux.go:303] "Creating device plugin manager" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.921977 4763 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.922015 4763 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.922234 4763 state_mem.go:36] "Initialized new in-memory state store" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.922330 4763 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.922976 4763 kubelet.go:418] "Attempting to sync node with API server" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.923000 4763 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
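
[Note] The nodeConfig={...} blob logged by container_manager_linux.go is JSON, and it carries the node's resource-reservation and eviction policy: 200m CPU / 350Mi memory / 350Mi ephemeral-storage held back for the system, a 4096 pod PID limit, the systemd cgroup driver, and five hard eviction thresholds. A small sketch that unmarshals an abbreviated copy of that blob to surface those knobs — the struct is trimmed to the fields the log actually shows and is not the kubelet's own NodeConfig type:

```go
// Sketch: decode the reservation/eviction knobs from the logged nodeConfig.
package main

import (
	"encoding/json"
	"fmt"
)

type threshold struct {
	Signal   string `json:"Signal"`
	Operator string `json:"Operator"`
	Value    struct {
		Quantity   *string `json:"Quantity"` // nil when the threshold is percentage-based
		Percentage float64 `json:"Percentage"`
	} `json:"Value"`
}

type nodeConfig struct {
	CgroupDriver           string            `json:"CgroupDriver"`
	PodPidsLimit           int               `json:"PodPidsLimit"`
	SystemReserved         map[string]string `json:"SystemReserved"`
	HardEvictionThresholds []threshold       `json:"HardEvictionThresholds"`
}

func main() {
	// Abbreviated copy of the logged nodeConfig (two of the five thresholds shown).
	raw := `{"CgroupDriver":"systemd","PodPidsLimit":4096,
	 "SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},
	 "HardEvictionThresholds":[
	  {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
	  {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`

	var cfg nodeConfig
	if err := json.Unmarshal([]byte(raw), &cfg); err != nil {
		panic(err)
	}
	fmt.Printf("driver=%s pidLimit=%d reserved=%v\n", cfg.CgroupDriver, cfg.PodPidsLimit, cfg.SystemReserved)
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("evict when %s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("evict when %s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}
```
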
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.923027 4763 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.923042 4763 kubelet.go:324] "Adding apiserver pod source" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.923055 4763 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.925799 4763 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.926374 4763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.926595 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Dec 01 09:14:42 crc kubenswrapper[4763]: E1201 09:14:42.926894 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.926873 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Dec 01 09:14:42 crc kubenswrapper[4763]: E1201 09:14:42.926954 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.928448 4763 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929184 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929217 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929231 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929244 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929264 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929317 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929332 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929351 4763 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929365 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929379 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929415 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.929429 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.930021 4763 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.930643 4763 server.go:1280] "Started kubelet" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.930948 4763 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.931050 4763 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 01 09:14:42 crc kubenswrapper[4763]: E1201 09:14:42.932348 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.199:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d0c999eaa470e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:14:42.930591502 +0000 UTC m=+0.199240280,LastTimestamp:2025-12-01 09:14:42.930591502 +0000 UTC m=+0.199240280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 09:14:42 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.932966 4763 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.931666 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.934320 4763 server.go:460] "Adding debug handlers to kubelet server" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.935959 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.935988 4763 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.936413 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 11:21:30.495486699 +0000 UTC Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.938796 4763 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.938836 4763 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.938975 4763 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.939737 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Dec 01 09:14:42 crc kubenswrapper[4763]: E1201 09:14:42.939825 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:14:42 crc kubenswrapper[4763]: E1201 09:14:42.940048 4763 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.940078 4763 factory.go:55] Registering systemd factory Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.940339 4763 factory.go:221] Registration of the systemd container factory successfully Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.944498 4763 factory.go:153] Registering CRI-O factory Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.944535 4763 factory.go:221] Registration of the crio container factory successfully Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.944633 4763 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.944660 4763 factory.go:103] Registering Raw factory Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.944679 4763 manager.go:1196] Started watching for new ooms in manager Dec 01 09:14:42 crc kubenswrapper[4763]: E1201 09:14:42.944784 
4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="200ms" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.945327 4763 manager.go:319] Starting recovery of all containers Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958723 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958780 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958804 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958815 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958827 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958839 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958852 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958862 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958876 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958886 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958896 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958905 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958915 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958950 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958961 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958971 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.958981 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959648 4763 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959671 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959683 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959694 4763 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959704 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959715 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959726 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959740 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959750 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959762 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959788 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959804 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959821 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959835 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959849 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959864 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959880 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959911 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959922 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959934 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959944 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959955 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959966 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959978 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.959990 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960004 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960016 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960029 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960042 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960071 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960083 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960095 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960108 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960119 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960132 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960142 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960159 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960175 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960185 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960195 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960206 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960217 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960227 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960237 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960247 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960258 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960268 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960277 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960288 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960297 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960307 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960317 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960327 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960338 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960348 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960357 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960403 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960414 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960425 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960435 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960446 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960470 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960480 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960489 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960499 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960509 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960518 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960526 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960535 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960547 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960560 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960571 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960580 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960590 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960599 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960610 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960620 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960630 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960641 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960653 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960662 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960672 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960683 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960695 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960706 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960717 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960728 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960738 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960757 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960769 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960779 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960793 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960803 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960813 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960851 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960863 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960874 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960884 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960894 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960903 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960912 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960930 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960940 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960951 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960962 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960972 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960982 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.960991 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961003 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961013 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961024 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961034 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961044 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961055 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961065 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961074 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961084 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961096 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961106 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961116 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961126 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961135 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961151 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961161 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961171 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961181 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961191 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961200 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961210 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961219 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961229 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961240 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961250 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961261 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961270 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961308 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961319 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961328 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961339 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961352 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961362 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961370 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961379 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961388 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961398 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961408 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961417 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961427 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961435 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961445 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961470 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961481 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961490 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961500 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961510 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961519 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961528 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961539 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961550 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961561 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961570 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961579 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961588 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961597 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961606 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961615 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961624 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961633 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961643 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961652 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961663 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961672 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961681 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961691 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961702 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961711 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961720 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961731 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961740 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961749 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961759 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961769 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961779 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961788 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961799 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961809 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.961820 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.962024 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.962034 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.962044 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.962058 4763 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.962067 4763 reconstruct.go:97] "Volume reconstruction finished" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.962075 4763 reconciler.go:26] "Reconciler: start to sync state" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.976878 4763 manager.go:324] Recovery completed Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.990387 4763 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.991175 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.992748 4763 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.992786 4763 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.992813 4763 kubelet.go:2335] "Starting kubelet main sync loop" Dec 01 09:14:42 crc kubenswrapper[4763]: E1201 09:14:42.992859 4763 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 01 09:14:42 crc kubenswrapper[4763]: W1201 09:14:42.993951 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Dec 01 09:14:42 crc kubenswrapper[4763]: E1201 09:14:42.994006 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.996440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.996501 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.996512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.997280 4763 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.997311 4763 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 01 09:14:42 crc kubenswrapper[4763]: I1201 09:14:42.997413 4763 state_mem.go:36] "Initialized new in-memory state store" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.006603 4763 policy_none.go:49] "None policy: Start" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.008024 4763 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.008051 4763 state_mem.go:35] "Initializing new in-memory state store" Dec 01 09:14:43 crc kubenswrapper[4763]: E1201 09:14:43.040840 4763 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 01 09:14:43 
crc kubenswrapper[4763]: I1201 09:14:43.074175 4763 manager.go:334] "Starting Device Plugin manager" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.074252 4763 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.074272 4763 server.go:79] "Starting device plugin registration server" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.074900 4763 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.074930 4763 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.075092 4763 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.075246 4763 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.075263 4763 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 01 09:14:43 crc kubenswrapper[4763]: E1201 09:14:43.083184 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.093131 4763 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.093242 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.094717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.094772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.094799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.094984 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.095144 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.095203 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.096382 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.096419 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.096434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.097047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.097073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.097084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.097174 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.097607 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.097637 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.097955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.098006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.098019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.098216 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.098297 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.098340 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.098536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.098577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.098587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099489 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099643 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.099723 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.101142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.101184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.101197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.101598 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.101648 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.103431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.103481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.103497 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.103515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.103502 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.103589 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: E1201 09:14:43.146049 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="400ms" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165038 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165074 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165166 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165200 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165222 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165258 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165276 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165331 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165371 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165398 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165434 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.165538 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.175066 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.176319 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.176341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.176364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.176385 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:14:43 crc kubenswrapper[4763]: E1201 09:14:43.176808 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266257 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266316 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266354 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266375 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266391 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" 
(UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266405 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266434 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266501 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266516 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266532 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266546 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.266943 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267000 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267044 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267053 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267078 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267084 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267101 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267157 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267141 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: 
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267161 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267181 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.267410 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.377679 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.378929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.378972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.378992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.379032 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 01 09:14:43 crc kubenswrapper[4763]: E1201 09:14:43.379478 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.430167 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.451688 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: W1201 09:14:43.458761 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-38f10b628075dc39a8861b445a66444b4311d829a3df2804cc0814de86112a8a WatchSource:0}: Error finding container 38f10b628075dc39a8861b445a66444b4311d829a3df2804cc0814de86112a8a: Status 404 returned error can't find the container with id 38f10b628075dc39a8861b445a66444b4311d829a3df2804cc0814de86112a8a Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.473181 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: W1201 09:14:43.477726 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-6096e91eaaf4dbf334bf1cd0be91d09f2a0508093b6f6f5f86b563fcb15b7838 WatchSource:0}: Error finding container 6096e91eaaf4dbf334bf1cd0be91d09f2a0508093b6f6f5f86b563fcb15b7838: Status 404 returned error can't find the container with id 6096e91eaaf4dbf334bf1cd0be91d09f2a0508093b6f6f5f86b563fcb15b7838 Dec 01 09:14:43 crc kubenswrapper[4763]: W1201 09:14:43.485972 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-bebd591222ec1c9580da7904dae3061c1357ccadd1d77b80979cb8efed514253 WatchSource:0}: Error finding container bebd591222ec1c9580da7904dae3061c1357ccadd1d77b80979cb8efed514253: Status 404 returned error can't find the container with id bebd591222ec1c9580da7904dae3061c1357ccadd1d77b80979cb8efed514253 Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.499759 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.506587 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 01 09:14:43 crc kubenswrapper[4763]: W1201 09:14:43.513444 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-dd5be85ad1385b199e1e56bcfac50e59d544e6e1bc3cb7bb72d965a9c19bbd67 WatchSource:0}: Error finding container dd5be85ad1385b199e1e56bcfac50e59d544e6e1bc3cb7bb72d965a9c19bbd67: Status 404 returned error can't find the container with id dd5be85ad1385b199e1e56bcfac50e59d544e6e1bc3cb7bb72d965a9c19bbd67 Dec 01 09:14:43 crc kubenswrapper[4763]: W1201 09:14:43.525738 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-cace735f5dae32102dea5f29c7f4766dc7f976e556c6083ccfa59532ff633d33 WatchSource:0}: Error finding container cace735f5dae32102dea5f29c7f4766dc7f976e556c6083ccfa59532ff633d33: Status 404 returned error can't find the container with id cace735f5dae32102dea5f29c7f4766dc7f976e556c6083ccfa59532ff633d33 Dec 01 09:14:43 crc kubenswrapper[4763]: E1201 09:14:43.547403 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="800ms" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.780193 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.782122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.782165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.782176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.782203 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:14:43 crc kubenswrapper[4763]: E1201 09:14:43.782650 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc" Dec 01 09:14:43 crc kubenswrapper[4763]: W1201 09:14:43.899811 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused Dec 01 09:14:43 crc kubenswrapper[4763]: E1201 09:14:43.899911 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError" Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.934039 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection 
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.937068 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 18:53:43.499296907 +0000 UTC
Dec 01 09:14:43 crc kubenswrapper[4763]: I1201 09:14:43.937126 4763 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 9h38m59.562180338s for next certificate rotation
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.001804 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="205ccedd626fcef9e39bc25285230cb42723a43d3dc518ffb728916551ca83bd" exitCode=0
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.002136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"205ccedd626fcef9e39bc25285230cb42723a43d3dc518ffb728916551ca83bd"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.002249 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cace735f5dae32102dea5f29c7f4766dc7f976e556c6083ccfa59532ff633d33"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.002394 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.006127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.006178 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.006192 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.008569 4763 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="731be123e5d1462afce7d9801d18622af2e6fcaf0ed2e6e33379084740c5ff0b" exitCode=0
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.008660 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"731be123e5d1462afce7d9801d18622af2e6fcaf0ed2e6e33379084740c5ff0b"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.008692 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dd5be85ad1385b199e1e56bcfac50e59d544e6e1bc3cb7bb72d965a9c19bbd67"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.008765 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.010166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.010198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.010207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.012727 4763 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f" exitCode=0
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.012811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.012851 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bebd591222ec1c9580da7904dae3061c1357ccadd1d77b80979cb8efed514253"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.012964 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.014032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.014071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.014084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.014628 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.014673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6096e91eaaf4dbf334bf1cd0be91d09f2a0508093b6f6f5f86b563fcb15b7838"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.016819 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c" exitCode=0
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.016854 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.016877 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"38f10b628075dc39a8861b445a66444b4311d829a3df2804cc0814de86112a8a"}
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.016976 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.017796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.017826 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.017837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.022815 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.023919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.023995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.024009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:44 crc kubenswrapper[4763]: E1201 09:14:44.350256 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="1.6s"
Dec 01 09:14:44 crc kubenswrapper[4763]: W1201 09:14:44.377177 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Dec 01 09:14:44 crc kubenswrapper[4763]: E1201 09:14:44.377262 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:14:44 crc kubenswrapper[4763]: W1201 09:14:44.377329 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Dec 01 09:14:44 crc kubenswrapper[4763]: E1201 09:14:44.377381 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:14:44 crc kubenswrapper[4763]: W1201 09:14:44.452969 4763 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.199:6443: connect: connection refused
Dec 01 09:14:44 crc kubenswrapper[4763]: E1201 09:14:44.453085 4763 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.199:6443: connect: connection refused" logger="UnhandledError"
Dec 01 09:14:44 crc kubenswrapper[4763]: E1201 09:14:44.483301 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.199:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d0c999eaa470e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:14:42.930591502 +0000 UTC m=+0.199240280,LastTimestamp:2025-12-01 09:14:42.930591502 +0000 UTC m=+0.199240280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.582933 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.595803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.595838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.595848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:44 crc kubenswrapper[4763]: I1201 09:14:44.595871 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 01 09:14:44 crc kubenswrapper[4763]: E1201 09:14:44.596392 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.199:6443: connect: connection refused" node="crc"
Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.021141 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c1ab7ef2bcb53a6cde7774272cbd2db9ee0e3727b2bbef195dbf3fcec8d38d65" exitCode=0
Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.021235 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c1ab7ef2bcb53a6cde7774272cbd2db9ee0e3727b2bbef195dbf3fcec8d38d65"}
Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.021443 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.022928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.022973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.022989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.024775 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0186133c8108a0a858362e4260a1fdd7954fd0cbe66c1ed37054373ef0ea131b"}
Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.024870 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
"Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.025486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.025516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.025527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.027241 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.027281 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.027297 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.027393 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.031377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.031422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.031432 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.033581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.033622 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.033644 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.033658 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.034358 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.034394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.034408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.037622 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.037669 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.037680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.037689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644"} Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.056201 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.699509 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:45 crc kubenswrapper[4763]: I1201 09:14:45.705926 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.041954 4763 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8e6d1fba34eeaf835d549514c348b57b53be7e392f936e8268010c9f7b37f2d4" exitCode=0 Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.042074 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8e6d1fba34eeaf835d549514c348b57b53be7e392f936e8268010c9f7b37f2d4"} Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.043688 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.044741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.044781 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.044790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.046547 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4"} Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.046579 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.046691 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.047288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.047320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.047336 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.047600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.047625 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.047634 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.197156 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.198994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.199041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.199054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:46 crc kubenswrapper[4763]: I1201 09:14:46.199086 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.053975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"59f8586d833e611cc11a602fbf39f54ed80ccd44ab26f37e78218ac5173bdb63"} Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.054031 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ae2e9c9d8cdfe47c478030163a4933523b03d7d30b72055e2337738d6f1ff329"} Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.054048 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f966b308848ea0e167479cf835baa54ff25a3a3f4605936601cb1b3b63cb8758"} Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.054058 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"983a34ec227d8d505c94d85fec440ad0c4a0362c75ce47921a771b3e847130f5"} Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.054078 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.054117 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.054961 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.055026 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.055327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.055362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.055375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.057976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.058012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.058024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:47 crc kubenswrapper[4763]: I1201 09:14:47.494656 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.056504 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.056605 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.060838 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cc9da6b09553e9e79a074eb98fac386ba3cedf2dcd98fd15d9b91c22bc9bbd9a"} Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.060882 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.061088 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.061831 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.061854 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.061864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.062488 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.062532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.062546 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.669603 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.669866 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.671424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.671485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:48 crc kubenswrapper[4763]: I1201 09:14:48.671495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.062928 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.063131 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.064343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.064420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.064430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.064489 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.064506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.064519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.118352 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.254433 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 01 09:14:49 crc 
Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.496351 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.497954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.498020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:49 crc kubenswrapper[4763]: I1201 09:14:49.498035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:50 crc kubenswrapper[4763]: I1201 09:14:50.067644 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:50 crc kubenswrapper[4763]: I1201 09:14:50.068355 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:50 crc kubenswrapper[4763]: I1201 09:14:50.070643 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:50 crc kubenswrapper[4763]: I1201 09:14:50.070701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:50 crc kubenswrapper[4763]: I1201 09:14:50.070738 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:50 crc kubenswrapper[4763]: I1201 09:14:50.070749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:50 crc kubenswrapper[4763]: I1201 09:14:50.070802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:50 crc kubenswrapper[4763]: I1201 09:14:50.070819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:51 crc kubenswrapper[4763]: I1201 09:14:51.696921 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 01 09:14:51 crc kubenswrapper[4763]: I1201 09:14:51.697209 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:51 crc kubenswrapper[4763]: I1201 09:14:51.698938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:14:51 crc kubenswrapper[4763]: I1201 09:14:51.698998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:14:51 crc kubenswrapper[4763]: I1201 09:14:51.699010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:14:52 crc kubenswrapper[4763]: I1201 09:14:52.237198 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 01 09:14:52 crc kubenswrapper[4763]: I1201 09:14:52.237535 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 01 09:14:52 crc kubenswrapper[4763]: I1201 09:14:52.239230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:52 crc kubenswrapper[4763]: I1201 09:14:52.239287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:52 crc kubenswrapper[4763]: I1201 09:14:52.239299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:53 crc kubenswrapper[4763]: E1201 09:14:53.083380 4763 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 01 09:14:54 crc kubenswrapper[4763]: I1201 09:14:54.936177 4763 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 01 09:14:55 crc kubenswrapper[4763]: I1201 09:14:55.600499 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 09:14:55 crc kubenswrapper[4763]: I1201 09:14:55.600565 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 09:14:55 crc kubenswrapper[4763]: I1201 09:14:55.612670 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 09:14:55 crc kubenswrapper[4763]: I1201 09:14:55.612786 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 09:14:58 crc kubenswrapper[4763]: I1201 09:14:58.057394 4763 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 09:14:58 crc kubenswrapper[4763]: I1201 09:14:58.057523 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.125990 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.126247 4763 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.127528 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.127572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.127586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.133028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.278802 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.279021 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.280242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.280300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.280313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.291950 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.501428 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.501618 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.502954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.502984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:14:59 crc kubenswrapper[4763]: I1201 09:14:59.502992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.093666 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.093703 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.093717 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.094832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.094963 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:00 crc 
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.094832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.095184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.095205 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:00 crc kubenswrapper[4763]: E1201 09:15:00.596107 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.597646 4763 trace.go:236] Trace[756193231]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:14:46.966) (total time: 13630ms):
Dec 01 09:15:00 crc kubenswrapper[4763]: Trace[756193231]: ---"Objects listed" error: 13630ms (09:15:00.597)
Dec 01 09:15:00 crc kubenswrapper[4763]: Trace[756193231]: [13.630943501s] [13.630943501s] END
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.597675 4763 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.598449 4763 trace.go:236] Trace[521346026]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:14:46.649) (total time: 13949ms):
Dec 01 09:15:00 crc kubenswrapper[4763]: Trace[521346026]: ---"Objects listed" error: 13949ms (09:15:00.598)
Dec 01 09:15:00 crc kubenswrapper[4763]: Trace[521346026]: [13.949158406s] [13.949158406s] END
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.598607 4763 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.598695 4763 trace.go:236] Trace[1487774734]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:14:47.220) (total time: 13377ms):
Dec 01 09:15:00 crc kubenswrapper[4763]: Trace[1487774734]: ---"Objects listed" error: 13377ms (09:15:00.598)
Dec 01 09:15:00 crc kubenswrapper[4763]: Trace[1487774734]: [13.377815921s] [13.377815921s] END
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.598719 4763 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.598904 4763 trace.go:236] Trace[1088488791]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Dec-2025 09:14:46.553) (total time: 14045ms):
Dec 01 09:15:00 crc kubenswrapper[4763]: Trace[1088488791]: ---"Objects listed" error: 14045ms (09:15:00.598)
Dec 01 09:15:00 crc kubenswrapper[4763]: Trace[1088488791]: [14.04567017s] [14.04567017s] END
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.598926 4763 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.600368 4763 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 01 09:15:00 crc kubenswrapper[4763]: E1201 09:15:00.601020 4763 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.798703 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39932->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.798760 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:39932->192.168.126.11:17697: read: connection reset by peer" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.798762 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54248->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.798868 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54248->192.168.126.11:17697: read: connection reset by peer" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.799064 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.799105 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.933534 4763 apiserver.go:52] "Watching apiserver" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.939076 4763 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.939417 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.939983 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:00 crc kubenswrapper[4763]: E1201 09:15:00.940049 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.940102 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.940387 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:00 crc kubenswrapper[4763]: E1201 09:15:00.940438 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.940516 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:00 crc kubenswrapper[4763]: E1201 09:15:00.940542 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.940587 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.940864 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.942127 4763 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.949558 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.949571 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.949702 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.949777 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.949782 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.949862 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.949957 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.950162 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.951216 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 01 09:15:00 crc kubenswrapper[4763]: I1201 09:15:00.982172 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.002267 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.002601 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.002710 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.002804 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.002906 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003023 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003127 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:15:01 crc 
kubenswrapper[4763]: I1201 09:15:01.003229 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003323 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003421 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.002797 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.002952 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003109 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003240 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003541 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003709 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003737 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003767 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003793 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003815 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003838 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003902 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003926 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003969 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003999 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004020 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004062 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004104 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004128 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004177 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004202 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004228 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004250 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004273 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004295 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004318 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004373 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004396 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004416 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004437 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004485 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004509 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004532 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004554 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004576 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004620 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004641 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004663 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004688 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004710 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004732 4763 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004757 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004779 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004802 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004849 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004873 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004896 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004919 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004943 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004991 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005014 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005035 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005120 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005145 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005167 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005186 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005230 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005254 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005278 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005305 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005330 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005353 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005378 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005402 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005424 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005448 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005493 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005623 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005650 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005699 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005723 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005748 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005791 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005817 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005840 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005864 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005903 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005929 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005953 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005984 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006007 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006030 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006053 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006075 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006122 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:15:01 crc 
kubenswrapper[4763]: I1201 09:15:01.006143 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006164 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006186 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.003923 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006208 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004642 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004802 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004897 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). 
InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.004954 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005060 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005231 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005409 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005546 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005726 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006300 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005771 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005884 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.005917 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006066 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006195 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006574 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006694 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006781 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006840 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006945 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006956 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.007157 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.007286 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.007332 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.007387 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.007798 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.008750 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.008900 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.008910 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009230 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009253 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009264 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.006232 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009326 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009349 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009369 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009386 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009403 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.009419 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.010694 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.011092 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.011129 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.011152 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.011173 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.011196 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.011218 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.011290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.012087 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.012337 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.012550 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.012825 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.012991 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.013566 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.013998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.014284 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.014644 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.015040 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.015254 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.015573 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.015568 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.016030 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.016038 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.020715 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.022739 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.022928 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.023185 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.023581 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024105 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024473 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.011408 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024562 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024586 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024622 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024646 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024666 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024692 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024714 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024733 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024754 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024776 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024766 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024797 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024823 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024844 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.024968 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025119 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025209 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025242 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025269 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025291 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025309 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025331 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025354 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025373 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025393 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025417 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025439 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025483 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025479 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025507 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025533 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025555 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025580 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025605 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025662 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025690 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025732 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025749 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025770 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025790 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025809 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025830 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025868 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025887 4763 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025906 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025935 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025960 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.025986 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026015 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026031 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026055 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026080 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026105 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026118 4763 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026133 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026164 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026186 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026193 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026206 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026289 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026331 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026469 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.026490 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.027022 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.027503 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.027843 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.027978 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.028925 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.029240 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.029981 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.032985 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.033264 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.034225 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.029691 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.034703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.035902 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.038382 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.039400 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.039629 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.039821 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.040352 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.040341 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.041429 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.041807 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.042184 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.042515 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.042659 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.042835 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.044237 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.045257 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.045536 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.045564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.045843 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.046077 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.046313 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.047171 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.047529 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.048730 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.049265 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.049361 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.049806 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.049804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.050083 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.050112 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.050364 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.050538 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.050689 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.050725 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.051169 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.051332 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.051474 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.051742 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.052009 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.052052 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.052217 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.052369 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.052493 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.052720 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.052759 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.052858 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:15:01.552826535 +0000 UTC m=+18.821475513 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.053179 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.054212 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.054904 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.055242 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.055981 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.056087 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.056219 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.056803 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.057153 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.057343 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.057704 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.057831 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.058001 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.058144 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.058421 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.058816 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.058820 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.058851 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.059666 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.059694 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.029315 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.059968 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.060137 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.062737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.062731 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.064039 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.064668 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.064663 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.064680 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.064848 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.066242 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.066655 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.066977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.067091 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.067701 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.068560 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.068745 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.068953 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.068986 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069287 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.032567 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.032661 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069379 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069404 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069423 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069440 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069481 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069489 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069573 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069632 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069664 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069692 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069719 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069743 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069772 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069821 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069851 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069875 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069896 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.069968 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070001 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070140 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070260 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070284 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070286 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070303 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070426 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070642 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070665 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070724 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070920 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070942 4763 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070954 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070964 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070978 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070988 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.070999 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071009 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071019 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071028 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071037 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071046 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071055 4763 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071064 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071075 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071085 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071096 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071105 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071113 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071122 4763 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071131 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071155 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071164 4763 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071174 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071184 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071194 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071204 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071213 4763 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071221 4763 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071230 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071242 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071250 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071259 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071268 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071277 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071285 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071294 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071331 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071303 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071442 4763 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071481 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071496 4763 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071526 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071538 4763 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071549 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071560 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071632 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 
09:15:01.071643 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071658 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071669 4763 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071707 4763 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071720 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071732 4763 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071744 4763 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071760 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071785 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071796 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071805 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071817 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071828 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 
01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071841 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071877 4763 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071888 4763 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071896 4763 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071906 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071916 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071988 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071999 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072009 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072046 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072056 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072065 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072074 4763 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 
crc kubenswrapper[4763]: I1201 09:15:01.072084 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072093 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072102 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072133 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072141 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072151 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072161 4763 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072185 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072209 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072218 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072227 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072236 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072245 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072260 4763 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072294 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072303 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072313 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072322 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072331 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072340 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072363 4763 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072374 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072382 4763 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072391 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072400 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072409 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") 
on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072418 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072439 4763 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072527 4763 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072537 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072546 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072557 4763 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072566 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072575 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072583 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072607 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072617 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072626 4763 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072636 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072645 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072654 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072662 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072686 4763 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072695 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072704 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072714 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072723 4763 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072732 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072740 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072767 4763 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072779 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072788 
4763 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072797 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072806 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072817 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072846 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072857 4763 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072866 4763 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072982 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073010 4763 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073021 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073030 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073054 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073063 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073072 4763 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073081 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073090 4763 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073099 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073109 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073132 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073141 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073150 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073159 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073168 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073177 4763 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073186 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073210 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073221 4763 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073229 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073238 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073248 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073257 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073266 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073291 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073300 4763 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073310 4763 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073321 4763 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073331 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073340 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073365 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073375 4763 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073384 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073395 4763 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073404 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073413 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073422 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073446 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.071494 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073550 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.073574 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:01.573546949 +0000 UTC m=+18.842195717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.073781 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.074117 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.074484 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.074713 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.075018 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.074571 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.075413 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.075612 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.075699 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.075860 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.076449 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.076434 4763 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.071945 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.072285 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.079891 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.080826 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.083413 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.084257 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.084383 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:01.584363384 +0000 UTC m=+18.853012152 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.085413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.085624 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.089782 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.089968 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.090441 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.090855 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.096227 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.097586 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.098166 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.100349 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4" exitCode=255 Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.100400 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4"} Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.104441 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.104639 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.104843 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.105102 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:01.605076097 +0000 UTC m=+18.873725035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.114195 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.115369 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.126533 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.132965 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.135359 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.136045 4763 scope.go:117] "RemoveContainer" containerID="b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4" Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.136517 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.136612 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.136723 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:01.636693718 +0000 UTC m=+18.905342486 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.137403 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.138145 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.139478 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.141353 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.177748 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.177969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178032 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178043 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178053 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178063 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178072 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178082 4763 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178091 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178102 4763 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178112 4763 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178121 4763 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178131 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178140 4763 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178149 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178158 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178167 4763 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178178 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178191 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178202 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178213 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178223 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178235 4763 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178245 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178256 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178268 4763 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178280 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.178313 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.177941 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.204884 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.259882 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.268819 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.276865 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.278563 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.332109 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.419052 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01
T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.454742 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.483434 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.513101 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.532701 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.556299 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.583183 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.583282 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.583432 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.583529 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:02.583511341 +0000 UTC m=+19.852160109 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.583600 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:15:02.583591243 +0000 UTC m=+19.852240011 (durationBeforeRetry 1s). 
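
The recurring `object "namespace"/"name" not registered` errors above mean, in broad terms, that the kubelet resolves Secret and ConfigMap volume sources through an internal object cache that is only populated as pods are re-registered after the restart; until then every projected token, CA bundle, and cert mount fails and is re-queued. A small log-reading aid for triaging a capture like this one (the input file name is hypothetical; the pattern tolerates both the escaped and unescaped quoting seen in these entries):

```python
import re
from collections import Counter

# Tally which objects the kubelet reports as "not registered" in a
# saved journal excerpt such as this one.
pat = re.compile(r'object \\?"([^"\\]+)\\?"/\\?"([^"\\]+)\\?" not registered')

counts = Counter()
with open("kubelet.log") as f:  # hypothetical capture of this journal
    for line in f:
        for ns, name in pat.findall(line):
            counts[(ns, name)] += 1

for (ns, name), n in counts.most_common():
    print(f"{n:4d}  {ns}/{name}")
```
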
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.684133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.684177 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.684197 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684266 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684324 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:02.684308372 +0000 UTC m=+19.952957140 (durationBeforeRetry 1s). 
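
The UnmountVolume.TearDown failure above ("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers") indicates the CSI node plugin has not yet re-registered with the restarted kubelet. Node plugins announce themselves through registration sockets that the kubelet watches; a sketch that lists them, assuming the conventional kubelet plugin-registration directory (adjust the path if your kubelet uses a different state root):

```python
import os
import stat

# List plugin-registration sockets the kubelet watches. After the CSI
# driver pod restarts, a socket for kubevirt.io.hostpath-provisioner
# should reappear here and the TearDown retries should succeed.
REG_DIR = "/var/lib/kubelet/plugins_registry"  # conventional default location

for entry in sorted(os.listdir(REG_DIR)):
    path = os.path.join(REG_DIR, entry)
    if stat.S_ISSOCK(os.stat(path).st_mode):
        print("registered plugin socket:", entry)
```
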
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684668 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684683 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684693 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684717 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:02.684710333 +0000 UTC m=+19.953359101 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684757 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684764 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684770 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.684789 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:02.684783385 +0000 UTC m=+19.953432153 (durationBeforeRetry 1s). 
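
Nearly every failed status patch in this excerpt dies against the same endpoint: `Post "https://127.0.0.1:9743/pod?timeout=10s": connect: connection refused`. The webhook name (`pod.network-node-identity.openshift.io`) suggests that port is served by the network-node-identity-vrzqb pod, which this same log shows being recreated, so status updates and the webhook briefly block each other until the host-network pods come up. A quick stdlib probe that distinguishes "nothing listening yet" from "listening but TLS-broken" (host and port taken from the log entries):

```python
import socket
import ssl

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the entries above

try:
    with socket.create_connection((HOST, PORT), timeout=2) as raw:
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE  # only asking whether TLS answers at all
        with ctx.wrap_socket(raw) as tls:
            print("listening; peer completed TLS", tls.version())
except ConnectionRefusedError:
    print("connection refused: nothing bound to the port yet")
except ssl.SSLError as e:
    print("port open but TLS handshake failed:", e)
```
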
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.993603 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:01 crc kubenswrapper[4763]: I1201 09:15:01.993984 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.994188 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 09:15:01 crc kubenswrapper[4763]: E1201 09:15:01.994672 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.104081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445"}
Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.104125 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"384de7c604473026c270bf3666cecc17a843d6ea0f9f1e04fe7374340429a08b"}
Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.106466 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.108502 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b"}
Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.109332 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.109491 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cb7dd3bf001826e5f7772a770e102a32428d52e7262fbf060f8524ddc377d5c6"}
Dec 01 09:15:02 crc
kubenswrapper[4763]: I1201 09:15:02.111229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151"} Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.111251 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a"} Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.111261 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3200980552de3c1fe6b2df6810cf191e24b7d1193118489ad1cdb0b0f9f9be01"} Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.131432 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01
T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.158643 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.180395 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.196027 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-l5kgb"] Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.196680 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-tjks4"] Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.196798 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.197438 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-kcjjj"] Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.197544 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tjks4" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.198235 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-fr552"] Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.198379 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.199014 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.201500 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.201625 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.203223 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.203397 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.205760 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.208374 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.208961 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.209436 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.210858 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.211006 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.211114 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.211526 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.212086 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.214317 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.214650 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.214802 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.248999 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288687 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-os-release\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288729 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-hostroot\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288747 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-daemon-config\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288761 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8938414d-9ef1-478b-9633-e43890dd4540-hosts-file\") pod \"node-resolver-tjks4\" (UID: \"8938414d-9ef1-478b-9633-e43890dd4540\") " pod="openshift-dns/node-resolver-tjks4" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288779 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288810 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-system-cni-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288823 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-os-release\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288838 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-cni-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288853 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47f7352d-5a70-4ded-93bf-875ac4531bff-cni-binary-copy\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-netns\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288881 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-cni-multus\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288894 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-cni-bin\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288909 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f95ef452-7057-4afb-a8ca-1c505b953c2e-mcd-auth-proxy-config\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288925 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47f7352d-5a70-4ded-93bf-875ac4531bff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288940 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmg68\" (UniqueName: \"kubernetes.io/projected/47f7352d-5a70-4ded-93bf-875ac4531bff-kube-api-access-lmg68\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95ef452-7057-4afb-a8ca-1c505b953c2e-proxy-tls\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288973 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnpt7\" (UniqueName: \"kubernetes.io/projected/f95ef452-7057-4afb-a8ca-1c505b953c2e-kube-api-access-bnpt7\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.288995 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-cnibin\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289010 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-kubelet\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289037 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-multus-certs\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289054 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ptlx\" (UniqueName: \"kubernetes.io/projected/8938414d-9ef1-478b-9633-e43890dd4540-kube-api-access-4ptlx\") pod \"node-resolver-tjks4\" (UID: \"8938414d-9ef1-478b-9633-e43890dd4540\") " pod="openshift-dns/node-resolver-tjks4" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289070 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-socket-dir-parent\") pod 
\"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289087 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhc7\" (UniqueName: \"kubernetes.io/projected/192e1ecd-fa1f-4227-a40c-4f7773682880-kube-api-access-xwhc7\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289100 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/192e1ecd-fa1f-4227-a40c-4f7773682880-cni-binary-copy\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-conf-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289126 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f95ef452-7057-4afb-a8ca-1c505b953c2e-rootfs\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289140 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-system-cni-dir\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289155 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-k8s-cni-cncf-io\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289174 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-etc-kubernetes\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.289189 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-cnibin\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.291646 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.325980 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.354974 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.373733 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390482 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-cnibin\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390539 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-kubelet\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390556 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-multus-certs\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390571 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-socket-dir-parent\") pod \"multus-fr552\" (UID: 
\"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ptlx\" (UniqueName: \"kubernetes.io/projected/8938414d-9ef1-478b-9633-e43890dd4540-kube-api-access-4ptlx\") pod \"node-resolver-tjks4\" (UID: \"8938414d-9ef1-478b-9633-e43890dd4540\") " pod="openshift-dns/node-resolver-tjks4" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390605 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/192e1ecd-fa1f-4227-a40c-4f7773682880-cni-binary-copy\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhc7\" (UniqueName: \"kubernetes.io/projected/192e1ecd-fa1f-4227-a40c-4f7773682880-kube-api-access-xwhc7\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390637 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-system-cni-dir\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390660 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-conf-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f95ef452-7057-4afb-a8ca-1c505b953c2e-rootfs\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390690 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-k8s-cni-cncf-io\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390710 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-etc-kubernetes\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390724 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-cnibin\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc 
kubenswrapper[4763]: I1201 09:15:02.390741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-daemon-config\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390754 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8938414d-9ef1-478b-9633-e43890dd4540-hosts-file\") pod \"node-resolver-tjks4\" (UID: \"8938414d-9ef1-478b-9633-e43890dd4540\") " pod="openshift-dns/node-resolver-tjks4" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390769 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390782 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-os-release\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390795 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-hostroot\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-system-cni-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-os-release\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390855 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47f7352d-5a70-4ded-93bf-875ac4531bff-cni-binary-copy\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390870 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-cni-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390884 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-netns\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390899 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-cni-multus\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390915 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47f7352d-5a70-4ded-93bf-875ac4531bff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmg68\" (UniqueName: \"kubernetes.io/projected/47f7352d-5a70-4ded-93bf-875ac4531bff-kube-api-access-lmg68\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-cni-bin\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f95ef452-7057-4afb-a8ca-1c505b953c2e-mcd-auth-proxy-config\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390975 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95ef452-7057-4afb-a8ca-1c505b953c2e-proxy-tls\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.390990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnpt7\" (UniqueName: \"kubernetes.io/projected/f95ef452-7057-4afb-a8ca-1c505b953c2e-kube-api-access-bnpt7\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.391275 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-cnibin\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.391305 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-kubelet\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.391326 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-multus-certs\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.391357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-socket-dir-parent\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392023 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/192e1ecd-fa1f-4227-a40c-4f7773682880-cni-binary-copy\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392171 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-system-cni-dir\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392201 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-conf-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392225 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f95ef452-7057-4afb-a8ca-1c505b953c2e-rootfs\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-k8s-cni-cncf-io\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392266 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-etc-kubernetes\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-cnibin\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " 
pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392567 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-cni-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392619 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-hostroot\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392720 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8938414d-9ef1-478b-9633-e43890dd4540-hosts-file\") pod \"node-resolver-tjks4\" (UID: \"8938414d-9ef1-478b-9633-e43890dd4540\") " pod="openshift-dns/node-resolver-tjks4" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/192e1ecd-fa1f-4227-a40c-4f7773682880-multus-daemon-config\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392776 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-run-netns\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.392797 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-cni-multus\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.393082 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47f7352d-5a70-4ded-93bf-875ac4531bff-cni-binary-copy\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.393103 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.393124 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-host-var-lib-cni-bin\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.393165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-os-release\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.393170 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47f7352d-5a70-4ded-93bf-875ac4531bff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.393204 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/192e1ecd-fa1f-4227-a40c-4f7773682880-system-cni-dir\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.393387 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47f7352d-5a70-4ded-93bf-875ac4531bff-os-release\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.393674 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f95ef452-7057-4afb-a8ca-1c505b953c2e-mcd-auth-proxy-config\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.400811 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.401031 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f95ef452-7057-4afb-a8ca-1c505b953c2e-proxy-tls\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.419067 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmg68\" (UniqueName: \"kubernetes.io/projected/47f7352d-5a70-4ded-93bf-875ac4531bff-kube-api-access-lmg68\") pod \"multus-additional-cni-plugins-kcjjj\" (UID: \"47f7352d-5a70-4ded-93bf-875ac4531bff\") " pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.424648 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ptlx\" (UniqueName: \"kubernetes.io/projected/8938414d-9ef1-478b-9633-e43890dd4540-kube-api-access-4ptlx\") pod \"node-resolver-tjks4\" (UID: \"8938414d-9ef1-478b-9633-e43890dd4540\") " pod="openshift-dns/node-resolver-tjks4" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.436890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnpt7\" (UniqueName: \"kubernetes.io/projected/f95ef452-7057-4afb-a8ca-1c505b953c2e-kube-api-access-bnpt7\") pod \"machine-config-daemon-l5kgb\" (UID: \"f95ef452-7057-4afb-a8ca-1c505b953c2e\") " pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.437685 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwhc7\" (UniqueName: \"kubernetes.io/projected/192e1ecd-fa1f-4227-a40c-4f7773682880-kube-api-access-xwhc7\") pod \"multus-fr552\" (UID: \"192e1ecd-fa1f-4227-a40c-4f7773682880\") " pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.438266 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.456745 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.468861 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.482633 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.498820 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.510504 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.511057 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.517100 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tjks4" Dec 01 09:15:02 crc kubenswrapper[4763]: W1201 09:15:02.520713 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95ef452_7057_4afb_a8ca_1c505b953c2e.slice/crio-8fd0f482e20d9b070ac90d0ecbfd7a63b146fc3c24108710054fd7a4741898c9 WatchSource:0}: Error finding container 8fd0f482e20d9b070ac90d0ecbfd7a63b146fc3c24108710054fd7a4741898c9: Status 404 returned error can't find the container with id 8fd0f482e20d9b070ac90d0ecbfd7a63b146fc3c24108710054fd7a4741898c9 Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.524038 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.532936 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-fr552" Dec 01 09:15:02 crc kubenswrapper[4763]: W1201 09:15:02.533233 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8938414d_9ef1_478b_9633_e43890dd4540.slice/crio-ee5db5a0d7e5d51d008f6342ee8004acdac286492d2c246460971a755a446f40 WatchSource:0}: Error finding container ee5db5a0d7e5d51d008f6342ee8004acdac286492d2c246460971a755a446f40: Status 404 returned error can't find the container with id ee5db5a0d7e5d51d008f6342ee8004acdac286492d2c246460971a755a446f40 Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.537621 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.552752 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.593686 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.593779 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.593910 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.593955 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:04.593942278 +0000 UTC m=+21.862591046 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.594284 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:15:04.594273007 +0000 UTC m=+21.862921775 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.623869 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rpg27"] Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.625100 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: W1201 09:15:02.627731 4763 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.627792 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.630627 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.631154 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.631199 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.631345 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.631365 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.631394 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.653583 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.674427 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.694936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzr9g\" (UniqueName: \"kubernetes.io/projected/e57a17bb-0609-4f45-ac9a-af60af65cdd9-kube-api-access-wzr9g\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.695069 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-kubelet\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc 
kubenswrapper[4763]: I1201 09:15:02.695134 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.695169 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.695233 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-ovn\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.695473 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.695496 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.695544 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.695619 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:04.695577832 +0000 UTC m=+21.964226600 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-systemd-units\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696338 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-slash\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696377 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-netns\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-env-overrides\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696527 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-ovn-kubernetes\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696612 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-node-log\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696640 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-netd\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696662 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-etc-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: 
\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696684 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-bin\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696709 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-log-socket\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696727 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696753 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696784 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696805 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-var-lib-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovn-node-metrics-cert\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696848 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-systemd\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-config\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.696890 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-script-lib\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.697035 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.697057 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.697066 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.697094 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:04.697083854 +0000 UTC m=+21.965732622 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.697134 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.697162 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:04.697153117 +0000 UTC m=+21.965801885 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.700729 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.721827 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.760894 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.782277 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.797887 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-var-lib-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798337 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovn-node-metrics-cert\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798406 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-systemd\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798426 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-config\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798448 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-script-lib\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798490 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzr9g\" (UniqueName: \"kubernetes.io/projected/e57a17bb-0609-4f45-ac9a-af60af65cdd9-kube-api-access-wzr9g\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-kubelet\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798596 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-ovn\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798621 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-systemd-units\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798644 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-slash\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-netns\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798689 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-env-overrides\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798713 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-ovn-kubernetes\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798745 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-node-log\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798767 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-netd\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798790 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-etc-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798813 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-bin\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-log-socket\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798908 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-log-socket\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798214 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-var-lib-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.798987 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799232 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-systemd-units\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799292 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-systemd\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799406 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-ovn-kubernetes\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799476 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-slash\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799507 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-netns\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-kubelet\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799892 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-netd\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-node-log\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799961 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799967 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-ovn\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-etc-openvswitch\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.800017 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-bin\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.799986 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-config\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.800236 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-env-overrides\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.800371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-script-lib\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.803894 4763 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.811138 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovn-node-metrics-cert\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.836074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzr9g\" (UniqueName: \"kubernetes.io/projected/e57a17bb-0609-4f45-ac9a-af60af65cdd9-kube-api-access-wzr9g\") pod \"ovnkube-node-rpg27\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.870273 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.895834 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.911853 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.929263 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.945380 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.993988 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:02 crc kubenswrapper[4763]: E1201 09:15:02.994138 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.998060 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 01 09:15:02 crc kubenswrapper[4763]: I1201 09:15:02.998846 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.000583 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.001301 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.002364 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.002958 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.003651 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.005108 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.005912 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.007004 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.007621 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.008953 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.009523 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.010026 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.011100 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.011759 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.012963 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.013367 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.013954 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.015041 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.015548 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.016890 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.017484 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.018921 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.019678 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 
09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.019934 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.020759 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.023260 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 01 09:15:03 crc 
kubenswrapper[4763]: I1201 09:15:03.023894 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.025227 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.026267 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.027387 4763 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.027549 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.029499 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.030655 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.031137 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.032859 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.033600 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.034588 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.035320 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.036369 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.036969 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 01 09:15:03 crc 
kubenswrapper[4763]: I1201 09:15:03.037909 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.038514 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.039559 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.040023 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.040957 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.041537 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.042701 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.043313 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.044217 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.045055 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.047341 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.047983 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.048585 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.050048 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.073517 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.092904 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.116044 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.121194 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr552" event={"ID":"192e1ecd-fa1f-4227-a40c-4f7773682880","Type":"ContainerStarted","Data":"18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d"} Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.121256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr552" 
event={"ID":"192e1ecd-fa1f-4227-a40c-4f7773682880","Type":"ContainerStarted","Data":"766e1b739ac3e0e1b0bf19d6411111b46a4ca9389207cc4b798e40c63e98ca8f"} Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.127592 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94"} Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.127649 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af"} Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.127665 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"8fd0f482e20d9b070ac90d0ecbfd7a63b146fc3c24108710054fd7a4741898c9"} Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.135675 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tjks4" event={"ID":"8938414d-9ef1-478b-9633-e43890dd4540","Type":"ContainerStarted","Data":"ee5db5a0d7e5d51d008f6342ee8004acdac286492d2c246460971a755a446f40"} Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.140601 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" event={"ID":"47f7352d-5a70-4ded-93bf-875ac4531bff","Type":"ContainerStarted","Data":"1b6dfcb6ae5251cc6eb16ba168934ed8b446be2d4e10fa2a8197e320e5923633"} Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.151141 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.170473 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.192064 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.207816 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.226654 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.243027 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.264041 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\
":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.281078 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.297918 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.326472 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.358191 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.382853 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.407569 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.427264 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.453073 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.481507 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.498330 4763 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.521593 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.549818 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\
":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.732147 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.733305 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:03 crc kubenswrapper[4763]: W1201 09:15:03.748974 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode57a17bb_0609_4f45_ac9a_af60af65cdd9.slice/crio-56930ca0aecb1732acbbc62e6b3478c1db76f269169a9eaebd00603d68907eb0 WatchSource:0}: Error finding container 56930ca0aecb1732acbbc62e6b3478c1db76f269169a9eaebd00603d68907eb0: Status 404 returned error can't find the container with id 56930ca0aecb1732acbbc62e6b3478c1db76f269169a9eaebd00603d68907eb0 Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.802511 4763 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.805346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.805381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.805391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.805514 4763 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.822415 4763 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.822690 4763 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.824048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.824087 4763 kubelet_node_status.go:724] 
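Every "Failed to update status for pod" entry above is rejected for the same root cause: the network-node-identity admission webhook on 127.0.0.1:9743 is serving a certificate whose NotAfter (2025-08-24T17:21:41Z) is months before the node's current clock (2025-12-01T09:15:03Z), so every TLS handshake fails verification. A minimal standalone Go sketch of the same validity-window check that crypto/x509 applies during verification; this is not kubelet code, and the certificate path is an assumption based on the webhook container's /etc/webhook-cert/ volumeMount shown above:

// check_cert_window.go — reproduces the class of failure in the log:
// "x509: certificate has expired or is not yet valid: current time ... is after ...".
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Assumed path: the webhook container mounts its serving cert at /etc/webhook-cert/.
	data, err := os.ReadFile("/etc/webhook-cert/tls.crt")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	now := time.Now().UTC()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n",
		cert.NotBefore.Format(time.RFC3339),
		cert.NotAfter.Format(time.RFC3339),
		now.Format(time.RFC3339))
	// This is the comparison the verifier performs; when it fails, Go reports
	// the exact error string seen throughout the kubelet log above.
	if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
		fmt.Println("certificate is outside its validity window")
	}
}

Run against the mounted secret, this would print a NotAfter in August against a December clock, matching the log's "current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z".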
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.824100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.824116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.824142 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:03Z","lastTransitionTime":"2025-12-01T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:03 crc kubenswrapper[4763]: E1201 09:15:03.895920 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.900608 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.900642 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
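Separately from the webhook failure, the "Node became not ready" entries show the kubelet holding the node NotReady because the runtime reports NetworkPluginNotReady: no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/ while ovnkube-node is still in PodInitializing. A small diagnostic sketch, assuming only that the runtime accepts *.conf, *.conflist, and *.json network configs in the directory named by the log message:

// cni_check.go — lists CNI network configs in the directory the kubelet
// message points at, to confirm why NetworkReady=false.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken from the log message above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	found := 0
	for _, e := range entries {
		// Assumed extensions: common CNI config file types.
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		// The state the log reports: the node stays NotReady until
		// ovnkube-node finishes initializing and writes its config.
		fmt.Println("no CNI configuration files found; node will report NotReady")
	}
}

Once ovnkube-controller writes its config into this directory, the NetworkPluginNotReady condition clears without any kubelet restart.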
event="NodeHasNoDiskPressure" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.900653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.900669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.900683 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:03Z","lastTransitionTime":"2025-12-01T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:03 crc kubenswrapper[4763]: E1201 09:15:03.927585 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.934506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.934547 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
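The payload in these retried node-status updates is a strategic merge patch: the "$setElementOrder/conditions" directive pins the ordering of the merged conditions list, whose elements are identified by the "type" merge key, and only changed fields of each condition are sent. A minimal sketch of that JSON shape, with the uid and condition values copied from the pod entries above and everything else illustrative rather than kubelet source:

// status_patch_shape.go — illustrates the strategic-merge-patch structure
// that status_manager.go is PATCHing to the apiserver in the entries above.
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	patch := map[string]any{
		"metadata": map[string]any{"uid": "ef543e1b-8068-4ea3-b32a-61027b32e95d"},
		"status": map[string]any{
			// Directive: preserve this ordering of the merged list, keyed by "type".
			"$setElementOrder/conditions": []map[string]string{
				{"type": "PodReadyToStartContainers"},
				{"type": "Initialized"},
				{"type": "Ready"},
				{"type": "ContainersReady"},
				{"type": "PodScheduled"},
			},
			// Only conditions whose fields changed are listed; the merge key
			// tells the apiserver which existing entries to update.
			"conditions": []map[string]string{
				{"lastTransitionTime": "2025-12-01T09:15:02Z", "status": "True", "type": "Ready"},
			},
		},
	}
	out, err := json.Marshal(patch)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
	// In the log, every such PATCH fails before reaching storage because the
	// admission webhook on 127.0.0.1:9743 cannot complete its TLS handshake,
	// which is why the same payload reappears on each retry.
}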
event="NodeHasNoDiskPressure" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.934560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.934574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.934583 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:03Z","lastTransitionTime":"2025-12-01T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:03 crc kubenswrapper[4763]: E1201 09:15:03.950737 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.954475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.954511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.954521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.954540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.954551 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:03Z","lastTransitionTime":"2025-12-01T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:03 crc kubenswrapper[4763]: E1201 09:15:03.970679 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.974668 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.974708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.974722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.974739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.974749 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:03Z","lastTransitionTime":"2025-12-01T09:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.994139 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:03 crc kubenswrapper[4763]: I1201 09:15:03.994154 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:03 crc kubenswrapper[4763]: E1201 09:15:03.994306 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:03 crc kubenswrapper[4763]: E1201 09:15:03.994527 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.009831 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.009960 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.012377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.012441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.012474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.012500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.012515 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.115145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.115203 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.115216 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.115240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.115257 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.142735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tjks4" event={"ID":"8938414d-9ef1-478b-9633-e43890dd4540","Type":"ContainerStarted","Data":"e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.143995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"56930ca0aecb1732acbbc62e6b3478c1db76f269169a9eaebd00603d68907eb0"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.145236 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.147361 4763 generic.go:334] "Generic (PLEG): container finished" podID="47f7352d-5a70-4ded-93bf-875ac4531bff" containerID="bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21" exitCode=0 Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.147395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" event={"ID":"47f7352d-5a70-4ded-93bf-875ac4531bff","Type":"ContainerDied","Data":"bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.170805 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\
\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.185582 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.217942 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.218634 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.218679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.218692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.218710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.218721 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.234802 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.251753 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.270698 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.289395 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.307942 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.325856 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.326272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.326285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.326304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.326320 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.326721 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.342221 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.359115 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.381030 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\
":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.1
68.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.397761 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.412319 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.429256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.429291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.429302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.429320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.429329 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.433718 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.446376 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.467271 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":
\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.481116 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.495281 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.518795 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.531024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.531056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.531066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.531079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.531088 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.534245 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.547391 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.559400 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.570855 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:04Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.615864 4763 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.616059 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:15:08.615998525 +0000 UTC m=+25.884647293 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.616132 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.616331 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.616388 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:08.616375646 +0000 UTC m=+25.885024414 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.633615 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.633662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.633671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.633688 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.633698 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.717438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.717532 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.717558 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717608 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717625 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717631 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 
09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717647 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717690 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:08.717671701 +0000 UTC m=+25.986320469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717706 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:08.717699382 +0000 UTC m=+25.986348150 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717741 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717779 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.717792 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.718789 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:08.71871022 +0000 UTC m=+25.987358998 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.735750 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.735779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.735787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.735802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.735810 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.838352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.838407 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.838418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.838434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.838446 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.941416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.941487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.941500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.941518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.941531 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:04Z","lastTransitionTime":"2025-12-01T09:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:04 crc kubenswrapper[4763]: I1201 09:15:04.993971 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:04 crc kubenswrapper[4763]: E1201 09:15:04.994108 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.044104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.044191 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.044201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.044217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.044225 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.059714 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.063529 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.069947 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.075010 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.090107 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.100996 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.110179 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.120321 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.131241 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.141602 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.146754 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.146785 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.146804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.146822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.146833 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.151496 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1" exitCode=0 Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.151588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.153858 4763 generic.go:334] "Generic (PLEG): container finished" podID="47f7352d-5a70-4ded-93bf-875ac4531bff" containerID="d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617" exitCode=0 Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.153893 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" event={"ID":"47f7352d-5a70-4ded-93bf-875ac4531bff","Type":"ContainerDied","Data":"d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.160346 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.180617 
4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.198707 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.211602 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.225255 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.243613 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.250971 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.251017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.251029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.251049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.251060 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.256939 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.269877 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.284363 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.303401 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.316320 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.330861 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.344942 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.353149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.353198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.353211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.353229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.353241 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.354743 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zrb77"] Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.355138 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.357144 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.357327 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.357470 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.362309 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.362547 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.380639 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.393183 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.407070 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.424965 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.429816 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-host\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.429851 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq99b\" (UniqueName: 
\"kubernetes.io/projected/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-kube-api-access-tq99b\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.429868 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-serviceca\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.439682 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",
\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.453694 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.455226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.455257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.455268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.455282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.455298 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.465862 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.475676 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.490126 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.507352 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.527017 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.531097 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-host\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.531135 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq99b\" (UniqueName: \"kubernetes.io/projected/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-kube-api-access-tq99b\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.531156 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-serviceca\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.531607 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-host\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.532505 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-serviceca\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.537619 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.551434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq99b\" (UniqueName: \"kubernetes.io/projected/bb8ad9e2-a93d-41cb-8014-296ebf0e7333-kube-api-access-tq99b\") pod \"node-ca-zrb77\" (UID: \"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\") " pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.553779 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.557438 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.557486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.557498 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.557519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.557530 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.569832 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.581752 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.663601 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.663649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.663659 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.663677 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.663689 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.671661 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.678698 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zrb77" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.698176 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"rest
artCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a
2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.712219 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.765980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.766016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.766027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.766044 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.766055 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.868160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.868186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.868194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.868208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.868216 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.972818 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.973349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.973362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.973381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.973392 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:05Z","lastTransitionTime":"2025-12-01T09:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.994009 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:05 crc kubenswrapper[4763]: I1201 09:15:05.994077 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:05 crc kubenswrapper[4763]: E1201 09:15:05.994150 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:05 crc kubenswrapper[4763]: E1201 09:15:05.994216 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.076679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.076720 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.076730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.076769 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.076783 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.175758 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.175808 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.175822 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.175831 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.175841 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.175851 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" 
event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.178435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.178484 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.178496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.178513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.178525 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.179384 4763 generic.go:334] "Generic (PLEG): container finished" podID="47f7352d-5a70-4ded-93bf-875ac4531bff" containerID="bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601" exitCode=0 Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.179438 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" event={"ID":"47f7352d-5a70-4ded-93bf-875ac4531bff","Type":"ContainerDied","Data":"bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.181729 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zrb77" event={"ID":"bb8ad9e2-a93d-41cb-8014-296ebf0e7333","Type":"ContainerStarted","Data":"34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.181769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zrb77" event={"ID":"bb8ad9e2-a93d-41cb-8014-296ebf0e7333","Type":"ContainerStarted","Data":"b4686bb302c69eadf00a05f32d4b050bff63e49b7474a7553a01897d25f28fe3"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.194652 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.207562 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.219877 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.229373 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.240801 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.256307 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.267679 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.281205 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.284912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.284955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.284970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.284988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.284997 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.304994 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b783964703
95f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.320286 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.342189 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.359144 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.372707 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.385396 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.387182 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.387222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.387233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.387249 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.387261 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.401514 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.413785 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.427759 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.441390 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.461019 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.473374 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.485901 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fa
c117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.489193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.489244 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.489275 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.489295 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.489306 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.500328 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.510121 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.522798 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.536903 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.555625 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.592392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.592436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.592445 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.592477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.592487 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.595803 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.632690 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:06Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.694484 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.694537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.694548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.694565 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.694577 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.796923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.796972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.796985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.797001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.797011 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.899509 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.899576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.899592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.899610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.899621 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:06Z","lastTransitionTime":"2025-12-01T09:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:06 crc kubenswrapper[4763]: I1201 09:15:06.993914 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:06 crc kubenswrapper[4763]: E1201 09:15:06.994030 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.004742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.004803 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.004819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.004840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.004855 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.107109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.107151 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.107160 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.107176 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.107187 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.189119 4763 generic.go:334] "Generic (PLEG): container finished" podID="47f7352d-5a70-4ded-93bf-875ac4531bff" containerID="1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895" exitCode=0 Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.189438 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" event={"ID":"47f7352d-5a70-4ded-93bf-875ac4531bff","Type":"ContainerDied","Data":"1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.212072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.212101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.212109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.212126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.212146 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.212694 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.230683 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.245063 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.261019 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.272922 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.290696 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.306746 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.314832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.314872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.314881 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.314899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.314911 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.321888 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.338819 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.350191 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.365200 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.390833 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z 
is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.401212 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.411557 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:07Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.417300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.417333 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.417343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.417359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.417368 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.519723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.519765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.519775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.519790 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.519801 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.621672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.621704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.621712 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.621726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.621734 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.724163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.724213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.724227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.724254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.724263 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.826697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.826739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.826750 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.826764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.826773 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.929018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.929064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.929075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.929092 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.929103 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:07Z","lastTransitionTime":"2025-12-01T09:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.993757 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:07 crc kubenswrapper[4763]: I1201 09:15:07.993784 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:07 crc kubenswrapper[4763]: E1201 09:15:07.993881 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:07 crc kubenswrapper[4763]: E1201 09:15:07.993972 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.031099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.031144 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.031156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.031173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.031184 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.133596 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.133639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.133651 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.133669 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.133680 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.195316 4763 generic.go:334] "Generic (PLEG): container finished" podID="47f7352d-5a70-4ded-93bf-875ac4531bff" containerID="cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4" exitCode=0 Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.195368 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" event={"ID":"47f7352d-5a70-4ded-93bf-875ac4531bff","Type":"ContainerDied","Data":"cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.209578 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a57
8bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.221779 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.231753 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.236075 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.236119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.236129 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.236147 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.236157 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.247492 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"
finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.270145 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.281581 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.294272 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\
"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.305503 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.316244 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.328281 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.338558 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.338814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.338889 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.338963 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.339035 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.341932 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.353056 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.365881 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.381119 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:08Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.441590 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.441630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.441639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.441653 
4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.441664 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.544791 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.544840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.544851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.544868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.544879 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.647521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.647556 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.647568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.647582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.647591 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.664025 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.664120 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.664208 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:15:16.664176504 +0000 UTC m=+33.932825272 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.664219 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.665500 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:16.664786091 +0000 UTC m=+33.933434929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.749883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.749911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.749918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.749931 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.749939 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.766688 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.766731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.766763 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.766889 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.766883 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.766906 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 
09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.767020 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.766909 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.767067 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.767077 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.767000 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:16.766968861 +0000 UTC m=+34.035617699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.767119 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:16.767107094 +0000 UTC m=+34.035755862 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.767135 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:16.767126915 +0000 UTC m=+34.035775803 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.852567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.852600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.852610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.852625 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.852637 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.954395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.954439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.954449 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.954488 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.954497 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:08Z","lastTransitionTime":"2025-12-01T09:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:08 crc kubenswrapper[4763]: I1201 09:15:08.993064 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:08 crc kubenswrapper[4763]: E1201 09:15:08.993185 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.057069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.057114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.057131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.057145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.057153 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.159844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.159888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.159902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.159923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.159946 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.204974 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.209298 4763 generic.go:334] "Generic (PLEG): container finished" podID="47f7352d-5a70-4ded-93bf-875ac4531bff" containerID="bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1" exitCode=0 Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.209341 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" event={"ID":"47f7352d-5a70-4ded-93bf-875ac4531bff","Type":"ContainerDied","Data":"bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.227049 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.243627 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.260399 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.262748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.262807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.262822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.262849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.262865 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.276611 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.291443 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.305551 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.326481 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.343628 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.359124 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.368064 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.368102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.368114 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.368133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.368147 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.374878 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.389010 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\
\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 
09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.408204 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reas
on\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.418855 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.433091 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-ope
rator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:09Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.470344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.470400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.470411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.470429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.470441 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.573415 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.573448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.573525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.573541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.573738 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.676737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.676765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.676774 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.676786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.676794 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.778956 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.778994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.779008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.779023 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.779035 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.881047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.881079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.881091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.881106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.881126 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.983584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.983623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.983635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.983651 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.983664 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:09Z","lastTransitionTime":"2025-12-01T09:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.993485 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:09 crc kubenswrapper[4763]: I1201 09:15:09.993518 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:09 crc kubenswrapper[4763]: E1201 09:15:09.993610 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:09 crc kubenswrapper[4763]: E1201 09:15:09.993807 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.086003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.086040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.086049 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.086065 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.086074 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.188254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.188301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.188310 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.188333 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.188343 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.216316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" event={"ID":"47f7352d-5a70-4ded-93bf-875ac4531bff","Type":"ContainerStarted","Data":"7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.233507 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.243310 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.254613 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.263922 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.275230 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.287962 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.290492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.290525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.290536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.290551 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.290561 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.300473 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.312767 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.325828 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.335263 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.347747 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.368647 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.380389 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.393195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.393469 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.393555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.393628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.393683 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.395347 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:10Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.495725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.495804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.495817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.495838 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.495850 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.597998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.598046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.598059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.598078 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.598090 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.700301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.700334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.700343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.700355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.700364 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.802991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.803020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.803032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.803045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.803054 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.905764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.905808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.905818 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.905833 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.905849 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:10Z","lastTransitionTime":"2025-12-01T09:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:10 crc kubenswrapper[4763]: I1201 09:15:10.993098 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:10 crc kubenswrapper[4763]: E1201 09:15:10.993248 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.008985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.009028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.009037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.009054 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.009066 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.111439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.111656 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.111761 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.111846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.111930 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.214860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.215315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.215326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.215342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.215354 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.317915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.317967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.317978 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.317995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.318009 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.420647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.420692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.420704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.420725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.420738 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.523760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.523819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.523830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.523850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.523866 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.627198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.627242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.627261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.627281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.627293 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.730239 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.730278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.730288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.730305 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.730315 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.837985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.838030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.838040 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.838058 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.838074 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.941292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.941338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.941357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.941381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.941398 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:11Z","lastTransitionTime":"2025-12-01T09:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.993263 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:11 crc kubenswrapper[4763]: E1201 09:15:11.993415 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:11 crc kubenswrapper[4763]: I1201 09:15:11.993867 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:11 crc kubenswrapper[4763]: E1201 09:15:11.993942 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.043230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.043265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.043277 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.043295 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.043306 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.145992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.146059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.146082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.146109 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.146129 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.228148 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.228595 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.228635 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.228653 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.249525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.249582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.249605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.249666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.249694 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.254308 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.257512 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.260875 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 
09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.272243 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.295401 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.317255 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.330790 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.352512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.352548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.352560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.352576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.352587 4763 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.356067 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.371011 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.387066 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.400199 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.416011 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.431097 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.442806 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.456413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.456497 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.456515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.456537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.456555 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.463019 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.475589 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.488969 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.503672 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.524392 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.536933 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.552701 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.558173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.558212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.558226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.558261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.558278 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.568361 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.583194 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.595565 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.608752 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.619263 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.636567 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.659900 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.660936 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.661012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.661035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.661066 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.661084 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.677287 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.693263 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:12Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.763541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.763604 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.763616 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.763653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.763665 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.865913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.865985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.866016 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.866038 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.866050 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.968163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.968207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.968217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.968231 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.968240 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:12Z","lastTransitionTime":"2025-12-01T09:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:12 crc kubenswrapper[4763]: I1201 09:15:12.993473 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:12 crc kubenswrapper[4763]: E1201 09:15:12.993745 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.006076 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.025232 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.043594 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.070972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.071001 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.071011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.071023 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.071032 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.077608 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.131075 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.156609 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.172918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.172947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.172958 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.172972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.172983 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.177146 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.189159 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.198840 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.209164 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.221504 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.236934 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.248231 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.264142 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:13Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.274876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.274924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.274933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.274947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.274956 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.377487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.377538 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.377551 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.377566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.377576 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.479982 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.480256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.480334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.480411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.480485 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.582892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.582937 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.582945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.582961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.582971 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.685465 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.685506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.685519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.685542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.685579 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.787976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.788011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.788020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.788037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.788045 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.890504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.890553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.890567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.890584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.890596 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.992899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.992939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.992949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.992963 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.992972 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:13Z","lastTransitionTime":"2025-12-01T09:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.993012 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:13 crc kubenswrapper[4763]: E1201 09:15:13.993113 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:13 crc kubenswrapper[4763]: I1201 09:15:13.993721 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:13 crc kubenswrapper[4763]: E1201 09:15:13.993929 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.045925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.045986 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.045999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.046017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.046030 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: E1201 09:15:14.057823 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.061754 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.061795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.061805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.061822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.061833 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: E1201 09:15:14.073407 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.077927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.077968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.077980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.077996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.078007 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: E1201 09:15:14.091686 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.095742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.095799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.095815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.095835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.095847 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: E1201 09:15:14.108428 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.113582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.113641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.113653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.113676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.113688 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: E1201 09:15:14.131778 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: E1201 09:15:14.131913 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.133888 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.133925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.133939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.133957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.133970 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.237048 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/0.log" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.237649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.237686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.237728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.237744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.237754 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.241108 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4" exitCode=1 Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.241248 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.242705 4763 scope.go:117] "RemoveContainer" containerID="e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.266537 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.282521 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.295606 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.310684 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.323788 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.337093 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.340073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.340099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.340108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.340121 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.340130 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.349696 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.361527 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.384893 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083d
e515abdf2e6dec43045974b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803642 5976 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803698 5976 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.803861 5976 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804046 5976 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804536 5976 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804957 5976 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:15:13.804986 5976 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:15:13.805005 5976 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:13.805015 5976 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:13.805032 5976 factory.go:656] Stopping watch factory\\\\nI1201 09:15:13.805034 5976 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:13.805044 5976 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
09:15:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.398415 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.410971 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac1
17eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.423553 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.435662 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.442351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.442384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.442397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.442414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.442426 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.453450 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02
b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.544896 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.544928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.544938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.544954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.544963 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.591624 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk"] Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.592126 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.594156 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.594375 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.608781 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.626188 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.643726 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.647004 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.647033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.647043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.647056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.647068 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.656763 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.672421 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.685688 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.699129 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.708230 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.721185 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.737580 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37dc3a40-ed8f-41fa-831c-fa08525f233c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: 
\"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.737616 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzvgl\" (UniqueName: \"kubernetes.io/projected/37dc3a40-ed8f-41fa-831c-fa08525f233c-kube-api-access-kzvgl\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.737649 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37dc3a40-ed8f-41fa-831c-fa08525f233c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.737674 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37dc3a40-ed8f-41fa-831c-fa08525f233c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.738076 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083d
e515abdf2e6dec43045974b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803642 5976 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803698 5976 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.803861 5976 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804046 5976 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804536 5976 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804957 5976 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:15:13.804986 5976 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:15:13.805005 5976 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:13.805015 5976 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:13.805032 5976 factory.go:656] Stopping watch factory\\\\nI1201 09:15:13.805034 5976 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:13.805044 5976 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
09:15:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.751212 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.751246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.751263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.751280 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.751290 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.759227 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.771777 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.783184 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.793219 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.803856 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:14Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.839334 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37dc3a40-ed8f-41fa-831c-fa08525f233c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.839373 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37dc3a40-ed8f-41fa-831c-fa08525f233c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.839417 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37dc3a40-ed8f-41fa-831c-fa08525f233c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.839445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzvgl\" (UniqueName: \"kubernetes.io/projected/37dc3a40-ed8f-41fa-831c-fa08525f233c-kube-api-access-kzvgl\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.840357 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37dc3a40-ed8f-41fa-831c-fa08525f233c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.840905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37dc3a40-ed8f-41fa-831c-fa08525f233c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.848992 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37dc3a40-ed8f-41fa-831c-fa08525f233c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.853839 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.853877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.853890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.853909 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.853922 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.858520 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzvgl\" (UniqueName: \"kubernetes.io/projected/37dc3a40-ed8f-41fa-831c-fa08525f233c-kube-api-access-kzvgl\") pod \"ovnkube-control-plane-749d76644c-j97pk\" (UID: \"37dc3a40-ed8f-41fa-831c-fa08525f233c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.904822 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.958536 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.958581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.958591 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.958605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.958616 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:14Z","lastTransitionTime":"2025-12-01T09:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:14 crc kubenswrapper[4763]: I1201 09:15:14.993477 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:14 crc kubenswrapper[4763]: E1201 09:15:14.993637 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.062901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.062955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.062967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.062987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.063000 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.166741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.166779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.166789 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.166806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.166818 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.247678 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/0.log" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.252947 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.253729 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.253897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" event={"ID":"37dc3a40-ed8f-41fa-831c-fa08525f233c","Type":"ContainerStarted","Data":"435784a5b40aa5b2f493732314fe0252b1f36c4860f82c6546e5e66a458185c4"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.271995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.272023 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.272033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.272048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.272058 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.297492 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.315074 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.330052 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.343835 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.355261 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.370027 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.374655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.374703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.374717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.374741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.374756 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.384299 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.396389 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
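
Note: the lastState entries with exitCode 137 and reason ContainerStatusUnknown (check-endpoints, networking-console-plugin, network-check-target-container) are statuses the kubelet synthesized for containers it could no longer locate after the pods were torn down, as their messages say. The 137 follows the conventional 128+signal encoding, i.e. SIGKILL; a one-liner decode sketch:

    package main

    import (
        "fmt"
        "syscall"
    )

    func main() {
        // Shell-style wait-status convention: exit codes above 128 encode
        // death by signal (128+N), so the exitCode 137 entries above mean
        // 128+9, i.e. SIGKILL.
        const code = 137
        if code > 128 {
            fmt.Printf("exit %d => signal %d (%v)\n",
                code, code-128, syscall.Signal(code-128))
        }
    }
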
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.408838 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.423401 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.428017 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d
652fcf1f592f2c9a267b7679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803642 5976 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803698 5976 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.803861 5976 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804046 5976 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804536 5976 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804957 5976 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:15:13.804986 5976 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:15:13.805005 5976 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:13.805015 5976 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:13.805032 5976 factory.go:656] Stopping watch factory\\\\nI1201 09:15:13.805034 5976 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:13.805044 5976 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
09:15:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.436983 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.448290 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.459855 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.469190 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.477508 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.477543 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.477553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.477574 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.477588 4763 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.483077 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.495318 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.513725 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.530170 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.541837 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.555875 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.570050 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.580361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.580403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.580418 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.580434 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.580444 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.584550 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12
-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 
09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.603740 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8b
ff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803642 5976 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803698 5976 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.803861 5976 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804046 5976 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804536 5976 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804957 5976 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:15:13.804986 5976 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:15:13.805005 5976 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:13.805015 5976 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:13.805032 5976 factory.go:656] Stopping watch factory\\\\nI1201 09:15:13.805034 5976 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:13.805044 5976 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
09:15:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.613471 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.624849 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.636019 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.648765 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.660110 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.669893 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.679634 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:15Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.681907 4763 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.681966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.681975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.681989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.681999 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.785404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.785563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.785594 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.785624 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.785645 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.888647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.888707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.888728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.888757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.888778 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.991630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.991883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.992156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.992730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.993114 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:15Z","lastTransitionTime":"2025-12-01T09:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.993081 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:15 crc kubenswrapper[4763]: E1201 09:15:15.994035 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:15 crc kubenswrapper[4763]: I1201 09:15:15.993078 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:15 crc kubenswrapper[4763]: E1201 09:15:15.997759 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.100897 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.100941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.100953 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.100968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.100978 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.119530 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rtkzb"] Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.120279 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.120414 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.140179 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.160092 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.174381 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.185713 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:1
4Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.198397 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.203145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.203340 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.203424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.203549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.203663 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.212012 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.227108 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.240394 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.252225 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn89v\" (UniqueName: \"kubernetes.io/projected/db50acd1-5694-49bc-9027-e96f7612e795-kube-api-access-kn89v\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.252581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.254616 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.260059 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/1.log" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.260874 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/0.log" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.264090 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679" exitCode=1 Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.264186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.264244 4763 scope.go:117] "RemoveContainer" containerID="e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.264901 4763 scope.go:117] "RemoveContainer" 
containerID="4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679" Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.265070 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.267140 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" event={"ID":"37dc3a40-ed8f-41fa-831c-fa08525f233c","Type":"ContainerStarted","Data":"a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.267269 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" event={"ID":"37dc3a40-ed8f-41fa-831c-fa08525f233c","Type":"ContainerStarted","Data":"739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.271287 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.285436 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.300664 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a
3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-
copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.306358 4763 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.306394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.306406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.306423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.306434 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.320425 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d
652fcf1f592f2c9a267b7679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803642 5976 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803698 5976 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.803861 5976 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804046 5976 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804536 5976 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804957 5976 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:15:13.804986 5976 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:15:13.805005 5976 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:13.805015 5976 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:13.805032 5976 factory.go:656] Stopping watch factory\\\\nI1201 09:15:13.805034 5976 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:13.805044 5976 ovnkube.go:599] Stopped ovnkube\\\\nI1201 
09:15:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.331579 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.347228 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.353847 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.353974 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn89v\" (UniqueName: \"kubernetes.io/projected/db50acd1-5694-49bc-9027-e96f7612e795-kube-api-access-kn89v\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.354103 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.354237 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs podName:db50acd1-5694-49bc-9027-e96f7612e795 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:16.8542034 +0000 UTC m=+34.122852358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs") pod "network-metrics-daemon-rtkzb" (UID: "db50acd1-5694-49bc-9027-e96f7612e795") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.365180 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.378197 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.380393 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn89v\" (UniqueName: \"kubernetes.io/projected/db50acd1-5694-49bc-9027-e96f7612e795-kube-api-access-kn89v\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.392337 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.405487 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.410087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.410341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.410424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.410520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.410595 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.418915 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.433895 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.452028 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e762fee35de6ee01144780f4cb6ccd5b6243083de515abdf2e6dec43045974b4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:13Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803642 5976 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1201 09:15:13.803698 5976 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.803861 5976 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804046 5976 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804536 5976 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1201 09:15:13.804957 5976 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:15:13.804986 5976 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:15:13.805005 5976 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:13.805015 5976 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:13.805032 5976 factory.go:656] Stopping watch factory\\\\nI1201 09:15:13.805034 5976 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:13.805044 5976 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:15:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"message\\\":\\\" removal\\\\nI1201 09:15:15.063670 6111 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:15.063702 6111 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 
09:15:15.063838 6111 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.063928 6111 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 09:15:15.063961 6111 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:15:15.063997 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:15.064015 6111 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 09:15:15.064021 6111 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:15:15.064042 6111 factory.go:656] Stopping watch factory\\\\nI1201 09:15:15.064053 6111 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:15:15.064086 6111 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.064094 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:15:15.064102 6111 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:15:15.064111 6111 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:15:15.064120 6111 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 09:15:15.064187 6111 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.462713 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.473064 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.484562 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.493264 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.505628 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.513703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.513737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.513748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.513770 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.513784 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.517741 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.528446 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01
T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.537250 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.548720 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.558598 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:16Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.617437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.617508 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.617518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.617538 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.617548 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.720805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.721060 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.721071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.721092 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.721102 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.757823 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.757957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.757983 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:15:32.757956648 +0000 UTC m=+50.026605426 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.758078 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.758140 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:32.758127603 +0000 UTC m=+50.026776381 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.824391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.824448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.824461 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.824495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.824506 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.859533 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.859590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.859620 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.859645 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.859761 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.859785 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.859815 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.859821 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.859831 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.859936 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.859838 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs podName:db50acd1-5694-49bc-9027-e96f7612e795 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:17.859817559 +0000 UTC m=+35.128466327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs") pod "network-metrics-daemon-rtkzb" (UID: "db50acd1-5694-49bc-9027-e96f7612e795") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.859985 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.860007 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.860014 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:32.859985633 +0000 UTC m=+50.128634391 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.860037 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:32.860029775 +0000 UTC m=+50.128678543 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.860081 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:32.860060946 +0000 UTC m=+50.128709884 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.928312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.928354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.928366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.928383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.928395 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:16Z","lastTransitionTime":"2025-12-01T09:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:16 crc kubenswrapper[4763]: I1201 09:15:16.993612 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:15:16 crc kubenswrapper[4763]: E1201 09:15:16.993774 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.031849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.031896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.031906 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.031924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.031937 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.134361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.134768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.134846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.134950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.135053 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.237113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.237148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.237156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.237171 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.237181 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.272017 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/1.log"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.278638 4763 scope.go:117] "RemoveContainer" containerID="4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679"
Dec 01 09:15:17 crc kubenswrapper[4763]: E1201 09:15:17.278800 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.292165 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.303777 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.315277 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.325785 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.334134 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.338944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.339150 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.339365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.339586 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.339759 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.344939 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.355464 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.369010 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.382225 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.393332 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.403776 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.417934 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.437922 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"message\\\":\\\" removal\\\\nI1201 09:15:15.063670 6111 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:15.063702 6111 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:15.063838 6111 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.063928 6111 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 09:15:15.063961 6111 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:15:15.063997 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:15.064015 6111 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 09:15:15.064021 6111 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:15:15.064042 6111 factory.go:656] Stopping watch factory\\\\nI1201 09:15:15.064053 6111 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:15:15.064086 6111 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.064094 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:15:15.064102 6111 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:15:15.064111 6111 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:15:15.064120 6111 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 09:15:15.064187 6111 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.442819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.442868 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.442878 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.442890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.442900 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.450690 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.463298 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.476608 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:17Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.545304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.545337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.545347 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.545362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.545374 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.647652 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.648003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.648253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.648513 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.648700 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.751081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.751137 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.751155 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.751173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.751185 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.853512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.853557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.853568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.853584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.853596 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.868780 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:17 crc kubenswrapper[4763]: E1201 09:15:17.868960 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:15:17 crc kubenswrapper[4763]: E1201 09:15:17.869257 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs podName:db50acd1-5694-49bc-9027-e96f7612e795 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:19.869237127 +0000 UTC m=+37.137885895 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs") pod "network-metrics-daemon-rtkzb" (UID: "db50acd1-5694-49bc-9027-e96f7612e795") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.956024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.956066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.956078 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.956093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.956105 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:17Z","lastTransitionTime":"2025-12-01T09:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.994038 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.994133 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:17 crc kubenswrapper[4763]: I1201 09:15:17.994242 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:17 crc kubenswrapper[4763]: E1201 09:15:17.994277 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:17 crc kubenswrapper[4763]: E1201 09:15:17.994164 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:17 crc kubenswrapper[4763]: E1201 09:15:17.994391 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.058784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.059641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.059743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.059846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.059939 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.162190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.162235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.162246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.162262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.162273 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.264164 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.264201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.264215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.264230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.264240 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.366821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.366866 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.366876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.366900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.366911 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.469994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.470077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.470103 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.470126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.470144 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.572828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.572862 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.572876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.572895 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.572908 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.676128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.676169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.676179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.676196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.676210 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.778430 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.778793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.778908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.778999 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.779083 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.881902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.881939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.881948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.881966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.881978 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.984448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.984517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.984530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.984548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.984560 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:18Z","lastTransitionTime":"2025-12-01T09:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:18 crc kubenswrapper[4763]: I1201 09:15:18.993573 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:18 crc kubenswrapper[4763]: E1201 09:15:18.993692 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.088207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.088259 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.088270 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.088288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.088300 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.191423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.191510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.191527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.191548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.191559 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.293627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.293679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.293689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.293709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.293721 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.414113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.414157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.414167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.414183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.414194 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.516957 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.517007 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.517015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.517031 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.517044 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.619175 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.619221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.619257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.619275 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.619286 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.722410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.722460 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.722505 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.722527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.722541 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.825288 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.825337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.825346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.825362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.825372 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.889244 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:19 crc kubenswrapper[4763]: E1201 09:15:19.889599 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 01 09:15:19 crc kubenswrapper[4763]: E1201 09:15:19.889715 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs podName:db50acd1-5694-49bc-9027-e96f7612e795 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:23.889683049 +0000 UTC m=+41.158331847 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs") pod "network-metrics-daemon-rtkzb" (UID: "db50acd1-5694-49bc-9027-e96f7612e795") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.929438 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.929582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.929608 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.929641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.929661 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:19Z","lastTransitionTime":"2025-12-01T09:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.993659 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.993719 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:19 crc kubenswrapper[4763]: E1201 09:15:19.993849 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 09:15:19 crc kubenswrapper[4763]: I1201 09:15:19.993860 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
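The nestedpendingoperations entry above shows the kubelet's per-volume retry backoff: each failed MountVolume.SetUp roughly doubles the wait before the next attempt (here "durationBeforeRetry 4s"), up to a cap. A minimal sketch of that doubling schedule; the 500ms initial delay and 2m cap are assumptions for illustration, not values read from this kubelet's configuration.

// Illustrative only: the doubling retry schedule behind
// "durationBeforeRetry 4s". Initial delay and cap are assumed.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // assumed initial backoff
	maxDelay := 2 * time.Minute     // assumed cap
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("mount attempt %d failed; next retry in %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Under these assumed values the fourth consecutive failure would land on the 4s delay seen in the log.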
Dec 01 09:15:19 crc kubenswrapper[4763]: E1201 09:15:19.994031 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795"
Dec 01 09:15:19 crc kubenswrapper[4763]: E1201 09:15:19.994171 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.032641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.032709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.032725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.032772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.032787 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:20Z","lastTransitionTime":"2025-12-01T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.135899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.135950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.135961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.135977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.135989 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:20Z","lastTransitionTime":"2025-12-01T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
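Every "network is not ready" error above traces back to the same missing file: nothing has yet written a network configuration into /etc/kubernetes/cni/net.d/. A small sketch of the equivalent check, to be run on the node; the directory path is copied from the log, and the extension list is an assumption covering common CNI config file names.

// Illustrative only: report whether any CNI network config exists in the
// directory the kubelet complains about.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d/"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read", dir, "-", err)
		return
	}
	found := false
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found network config:", e.Name())
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration present; the network plugin has not written its config yet")
	}
}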
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.854396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.854451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.854492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.854514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.854543 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:20Z","lastTransitionTime":"2025-12-01T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.956804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.956850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.956861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.956879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.956892 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:20Z","lastTransitionTime":"2025-12-01T09:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:20 crc kubenswrapper[4763]: I1201 09:15:20.994106 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:15:20 crc kubenswrapper[4763]: E1201 09:15:20.994354 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 09:15:21 crc kubenswrapper[4763]: I1201 09:15:21.890824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:21 crc kubenswrapper[4763]: I1201 09:15:21.890901 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:21 crc kubenswrapper[4763]: I1201 09:15:21.890926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:21 crc kubenswrapper[4763]: I1201 09:15:21.890958 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:21 crc kubenswrapper[4763]: I1201 09:15:21.891002 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:21Z","lastTransitionTime":"2025-12-01T09:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:21 crc kubenswrapper[4763]: I1201 09:15:21.993489 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:21 crc kubenswrapper[4763]: E1201 09:15:21.993629 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 09:15:21 crc kubenswrapper[4763]: I1201 09:15:21.993705 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:15:21 crc kubenswrapper[4763]: E1201 09:15:21.993758 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 09:15:21 crc kubenswrapper[4763]: I1201 09:15:21.993892 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:21 crc kubenswrapper[4763]: E1201 09:15:21.994038 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.821694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.821733 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.821741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.821754 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.821763 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:22Z","lastTransitionTime":"2025-12-01T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.923974 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.924033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.924043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.924057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.924084 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:22Z","lastTransitionTime":"2025-12-01T09:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:22 crc kubenswrapper[4763]: I1201 09:15:22.993540 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:15:22 crc kubenswrapper[4763]: E1201 09:15:22.993678 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.008001 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z"
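The status-patch failures above (and those that follow) all fail the same way: the node-identity webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, so the API server rejects every pod status update. A hedged diagnostic sketch, to be run on the node itself: dial the logged endpoint with chain verification disabled so the handshake completes, then print the presented certificate's validity window.

// Illustrative only: inspect the webhook serving certificate that the
// kubelet's status patches are rejected against. Address copied from the
// log; InsecureSkipVerify is deliberate so the expired chain can be read.
package main

import (
	"crypto/tls"
	"fmt"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("handshake failed:", err)
		return
	}
	defer conn.Close()
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore)
	fmt.Println("notAfter: ", cert.NotAfter) // the log reports expiry 2025-08-24T17:21:41Z
}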
Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.020924 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.025828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.025883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.025900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.025922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.025941 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.033566 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.050278 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.071178 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"message\\\":\\\" removal\\\\nI1201 09:15:15.063670 6111 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:15.063702 6111 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:15.063838 6111 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.063928 6111 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 09:15:15.063961 6111 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:15:15.063997 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:15.064015 6111 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 09:15:15.064021 6111 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:15:15.064042 6111 factory.go:656] Stopping watch factory\\\\nI1201 09:15:15.064053 6111 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:15:15.064086 6111 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.064094 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:15:15.064102 6111 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:15:15.064111 6111 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:15:15.064120 6111 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 09:15:15.064187 6111 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.081355 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.098637 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-
apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.114319 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.127764 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.128985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.129166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.129273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.129298 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.129309 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.138991 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.152174 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.165267 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.178577 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.196651 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.211457 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.223730 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:23Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.232559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.232606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.232618 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.232633 
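The two "Failed to update status for pod" entries above share one root cause: every call to the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 fails TLS verification because the serving certificate's notAfter (2025-08-24T17:21:41Z) is long past the node clock (2025-12-01T09:15:23Z). A minimal sketch to read an endpoint's certificate window from the node, assuming Python 3 with the third-party cryptography package (host and port taken from the log line):

    import socket, ssl
    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the log

    # Disable verification: the point is to read a certificate we already
    # expect to be invalid, not to trust it. check_hostname must be cleared
    # before verify_mode can be set to CERT_NONE.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # raw DER, available even unverified

    cert = x509.load_der_x509_certificate(der)
    print("notBefore:", cert.not_valid_before)  # naive UTC datetimes
    print("notAfter: ", cert.not_valid_after)

If notAfter printed here matches 2025-08-24T17:21:41Z, the x509 error in the log is fully explained by the clock, and every webhook-gated API write will keep failing the same way until the certificate rotates.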
4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.232643 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.335076 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.335110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.335119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.335133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.335143 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.437745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.437785 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.437795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.437810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.437823 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.540401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.540499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.540512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.540530 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.540542 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.643810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.643858 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.643869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.643891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.643906 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.746410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.746459 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.746468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.746506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.746523 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
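The node condition written by setters.go repeats roughly every 100 ms with the same reason, KubeletNotReady: the runtime reports NetworkReady=false because no CNI network config exists yet in /etc/kubernetes/cni/net.d/, and the kubelet stays NotReady until the network provider writes one. A sketch of the presence check involved, assuming the usual libcni candidate suffixes (.conf, .conflist, .json — the suffix list is an assumption; the directory comes from the log message):

    import os

    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"   # path from the log message
    SUFFIXES = (".conf", ".conflist", ".json")   # assumed libcni candidates

    def cni_config_present(conf_dir: str = CNI_CONF_DIR) -> bool:
        """True if at least one candidate CNI network config file exists."""
        try:
            names = os.listdir(conf_dir)
        except FileNotFoundError:
            return False
        return any(n.endswith(SUFFIXES) for n in names)

    print("CNI config present:", cni_config_present())

Once a config file appears in that directory, the runtime flips NetworkReady to true and these repeating blocks stop.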
Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.849481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.849523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.849532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.849550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.849561 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.929637 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:23 crc kubenswrapper[4763]: E1201 09:15:23.929784 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:15:23 crc kubenswrapper[4763]: E1201 09:15:23.929855 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs podName:db50acd1-5694-49bc-9027-e96f7612e795 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:31.929837121 +0000 UTC m=+49.198485889 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs") pod "network-metrics-daemon-rtkzb" (UID: "db50acd1-5694-49bc-9027-e96f7612e795") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.952934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.952978 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.952992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.953012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.953026 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:23Z","lastTransitionTime":"2025-12-01T09:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.993582 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:23 crc kubenswrapper[4763]: E1201 09:15:23.993708 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.994183 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:23 crc kubenswrapper[4763]: E1201 09:15:23.994364 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:23 crc kubenswrapper[4763]: I1201 09:15:23.994183 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:23 crc kubenswrapper[4763]: E1201 09:15:23.994516 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
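The metrics-certs mount fails because the secret openshift-multus/metrics-daemon-secret is not yet registered in the kubelet's local object store (unsurprising this early after startup, m=+49s), and nestedpendingoperations schedules the retry 8 s out rather than spinning. That 8 s reads as one rung of a doubling, capped backoff ladder; a sketch of the pattern with illustrative constants (not kubelet's exact values):

    def backoff_delays(initial=0.5, factor=2.0, cap=16.0, attempts=8):
        """Yield doubling retry delays in seconds, clamped at `cap`."""
        delay = initial
        for _ in range(attempts):
            yield min(delay, cap)
            delay *= factor

    print(list(backoff_delays()))
    # [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 16.0, 16.0]

The mount succeeds as soon as the secret shows up in the kubelet's store, on whichever retry first observes it.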
pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.055553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.055916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.055997 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.056077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.056155 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.159587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.159627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.159635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.159654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.159665 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.262222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.262269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.262281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.262299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.262311 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.365009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.365100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.365108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.365128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.365139 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.467987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.468071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.468138 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.468165 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.468183 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
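Three pods (networking-console-plugin, network-check-target, network-metrics-daemon) have no sandbox yet and their syncs are skipped with the same network-is-not-ready error, while the five-line node-status block keeps repeating on its ~100 ms cadence. Windows like this read much better collapsed; a sketch that groups consecutive identical kubelet messages, assuming one journal entry per line in the format shown above:

    import re
    from itertools import groupby

    LINE = re.compile(
        r'^(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}) \S+ kubenswrapper\[\d+\]: '
        r'[IEW]\d{4} [\d:.]+\s+\d+ \S+ (?P<msg>"[^"]*")'
    )

    def summarize(lines):
        parsed = [m.groupdict() for m in map(LINE.match, lines) if m]
        for msg, grp in groupby(parsed, key=lambda d: d["msg"]):
            grp = list(grp)
            print(f'{len(grp):5d}x {msg} (first at {grp[0]["ts"]})')

Run over this window it reduces hundreds of lines to a handful of distinct messages, so the actual transitions (the NotReady condition, the skipped pod syncs, the mount retries) stand out.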
Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.527380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.527423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.527435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.527452 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.527485 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: E1201 09:15:24.542378 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.547520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.547600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
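The err= value above embeds the node-status patch as doubly escaped JSON: the patch was quoted once into the error string and the whole string quoted again by the logger, so each JSON quote surfaces as \\\" in the journal. A sketch that recovers and pretty-prints the patch, assuming the whole entry is on one line and the payload is ASCII (unicode_escape is a shortcut, not a real Go unquoter):

    import codecs, json, re

    def extract_patch(journal_line: str) -> dict:
        """Recover the JSON patch from a kubelet 'failed to patch status' entry."""
        m = re.search(r'failed to patch status \\"(.*)\\" for ', journal_line)
        if not m:
            raise ValueError("no embedded patch in this line")
        once = codecs.decode(m.group(1), "unicode_escape")  # \\\" -> \"
        return json.loads(codecs.decode(once, "unicode_escape"))  # \" -> "

    # patch = extract_patch(raw_line)
    # print(json.dumps(patch, indent=2))

Decoded, the patch is an ordinary strategic-merge payload — the four node conditions, allocatable/capacity, the image list, nodeInfo — and the entire write is rejected by the node.network-node-identity.openshift.io webhook with the same expired-certificate error as the pod-status patches earlier.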
event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.547614 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.547632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.547647 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: E1201 09:15:24.563168 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.567226 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.567274 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
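Note the spacing of these "Error updating node status, will retry" failures: .542, .563, .584, .605 within the same second. The kubelet makes several back-to-back attempts per status sync before waiting for the next interval (upstream kubelet uses a small fixed retry count, five in the versions I have read — treat the exact number as an assumption here). The timestamps inside the error show why no retry can succeed:

    from datetime import datetime

    # Timestamps lifted verbatim from the webhook error above.
    now = datetime.fromisoformat("2025-12-01T09:15:24+00:00")
    not_after = datetime.fromisoformat("2025-08-24T17:21:41+00:00")
    print(now - not_after)  # 98 days, 15:53:43 — how long the cert has been expired

A certificate that expired more than three months before this boot fails identically on every attempt, so the retries are deterministic noise until the cluster rotates its internal certificates (on CRC that rotation typically happens on its own once the cluster settles).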
event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.567286 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.567304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.567318 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: E1201 09:15:24.584796 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.589240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.589520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.589644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.589742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.589827 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: E1201 09:15:24.605847 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.609417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.609541 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
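
Every status-patch attempt in this stretch dies in the TLS handshake with the node.network-node-identity.openshift.io webhook: its serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-01. A quick check from the node is to pull the certificate off the listener and print its validity window. The Go sketch below does that; it assumes the webhook is still serving on 127.0.0.1:9743 as logged, and it is a diagnostic illustration, not cluster tooling.

    // certcheck.go: print the validity window of the certificate served on
    // the webhook port named in the log (assumption: 127.0.0.1:9743 is
    // still listening on this node).
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // InsecureSkipVerify lets us read the certificate even though normal
        // verification fails with "certificate has expired"; diagnostic only.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatalf("dial webhook: %v", err)
        }
        defer conn.Close()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%v\n",
                cert.Subject.CommonName,
                cert.NotBefore.Format(time.RFC3339),
                cert.NotAfter.Format(time.RFC3339),
                time.Now().After(cert.NotAfter))
        }
    }

If the printed window confirms the expiry, the fix is certificate rotation, not networking; on OpenShift Local a restarted cluster normally re-issues its internal certificates after a delay.
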
event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.609553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.609568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.609577 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: E1201 09:15:24.622825 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:24Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:24 crc kubenswrapper[4763]: E1201 09:15:24.622943 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.624755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
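
The "exceeds retry count" line is the kubelet giving up on this sync round rather than a new failure: each node-status sync makes a small fixed number of in-process attempts (the kubelet's nodeStatusUpdateRetry constant, 5 at the time of writing) and then waits for the next sync period, which is why the same patch error reappears in batches below. A rough sketch of that control flow, with illustrative names rather than the kubelet's actual code:

    // retrysketch.go: the bounded-retry shape behind "Error updating node
    // status, will retry" followed by "Unable to update node status".
    // Names and the always-failing push are illustrative assumptions.
    package main

    import (
        "errors"
        "fmt"
    )

    // The kubelet bounds in-process attempts per sync with a small constant.
    const nodeStatusUpdateRetry = 5

    // pushStatus stands in for the PATCH to the API server; here it always
    // fails, mirroring the webhook rejection in the log.
    func pushStatus() error {
        return errors.New("failed calling webhook: x509: certificate has expired")
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := pushStatus(); err != nil {
                fmt.Printf("Error updating node status, will retry: %v\n", err)
                continue
            }
            return nil
        }
        return errors.New("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            // The real kubelet logs this and tries again on the next sync tick.
            fmt.Println("Unable to update node status:", err)
        }
    }
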
event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.624804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.624814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.624829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.624839 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.727249 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.727307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.727319 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.727335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.727347 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.829818 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.829869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.829882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.829899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.829914 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.933641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.933673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.933681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.933695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.933706 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:24Z","lastTransitionTime":"2025-12-01T09:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:24 crc kubenswrapper[4763]: I1201 09:15:24.993550 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:24 crc kubenswrapper[4763]: E1201 09:15:24.993693 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.036342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.036386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.036396 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.036409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.036419 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.139319 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.139355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.139364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.139381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.139392 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.242553 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.242597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.242673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.242691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.242703 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.345564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.345603 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.345612 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.345626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.345636 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.448299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.448327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.448335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.448347 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.448355 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.550335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.550706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.550809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.550919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.551009 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.653572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.653617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.653628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.653645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.653655 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.757401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.757441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.757454 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.757487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.757498 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.860032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.860112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.860126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.860148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.860166 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.963710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.963788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.963800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.963842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.963859 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:25Z","lastTransitionTime":"2025-12-01T09:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.993522 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.993633 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:25 crc kubenswrapper[4763]: E1201 09:15:25.994024 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:25 crc kubenswrapper[4763]: I1201 09:15:25.993712 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:25 crc kubenswrapper[4763]: E1201 09:15:25.994380 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:25 crc kubenswrapper[4763]: E1201 09:15:25.994179 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.066828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.066881 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.066893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.066912 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.066924 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.169988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.170032 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.170042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.170061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.170075 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.272905 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.272940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.272948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.272962 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.272971 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.375477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.375514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.375526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.375540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.375585 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.478645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.478698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.478709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.478723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.478734 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.581271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.581314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.581348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.581378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.581389 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.683613 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.683655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.683666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.683682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.683693 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.786663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.786711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.786719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.786732 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.786741 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.889641 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.889713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.889966 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.889991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.890002 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.993129 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:26 crc kubenswrapper[4763]: E1201 09:15:26.993264 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.993357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.993408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.993422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.993497 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:26 crc kubenswrapper[4763]: I1201 09:15:26.993518 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:26Z","lastTransitionTime":"2025-12-01T09:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.096682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.096738 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.096749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.096769 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.096782 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.199760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.199797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.199811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.199829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.199840 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.302897 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.302973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.302987 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.303005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.303045 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.405273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.405313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.405325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.405341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.405350 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.508366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.508412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.508424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.508440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.508470 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.611037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.611188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.611209 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.611242 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.611260 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.714119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.714177 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.714191 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.714211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.714225 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.817103 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.817146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.817157 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.817173 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.817183 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.920315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.920361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.920370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.920385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.920396 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:27Z","lastTransitionTime":"2025-12-01T09:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.993708 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.993724 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:27 crc kubenswrapper[4763]: E1201 09:15:27.993854 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:27 crc kubenswrapper[4763]: E1201 09:15:27.994020 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:27 crc kubenswrapper[4763]: I1201 09:15:27.993741 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:27 crc kubenswrapper[4763]: E1201 09:15:27.994753 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.022850 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.023170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.023235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.023314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.023398 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.126278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.126321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.126332 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.126348 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.126360 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.228872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.228941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.228964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.228993 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.229015 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.332066 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.332659 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.332812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.333010 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.333192 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.437420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.437481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.437495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.437511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.437523 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.540519 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.540560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.540570 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.540588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.540602 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.643261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.643646 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.643726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.643806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.643862 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.746841 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.746887 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.747091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.747107 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.747117 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.849739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.849772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.849781 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.849798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.849816 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.952755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.953195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.953342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.953577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.953726 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:28Z","lastTransitionTime":"2025-12-01T09:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:28 crc kubenswrapper[4763]: I1201 09:15:28.993839 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:28 crc kubenswrapper[4763]: E1201 09:15:28.994062 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.057852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.058450 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.058950 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.059352 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.059676 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.163051 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.163523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.163606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.163707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.163781 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.266737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.267227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.267390 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.267577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.267807 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.370561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.370617 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.370631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.370648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.370658 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.473478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.473523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.473532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.473551 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.473560 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.576584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.576626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.576639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.576659 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.576672 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.679794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.679845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.679857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.679874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.679887 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.782229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.782582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.782663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.782758 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.782875 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.885328 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.885765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.885930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.886051 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.886172 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.989313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.989354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.989365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.989382 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.989393 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:29Z","lastTransitionTime":"2025-12-01T09:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.993632 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:29 crc kubenswrapper[4763]: E1201 09:15:29.993749 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.993981 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.994095 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:29 crc kubenswrapper[4763]: E1201 09:15:29.994187 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:29 crc kubenswrapper[4763]: E1201 09:15:29.994310 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:29 crc kubenswrapper[4763]: I1201 09:15:29.994441 4763 scope.go:117] "RemoveContainer" containerID="4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.093097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.094686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.094791 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.094899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.094998 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.202156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.202195 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.202241 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.202261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.202276 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.304647 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.304704 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.304716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.304730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.304759 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
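A quick way to see how hot the collapsed status loop runs is to tally the setters.go:603 entries per second. A minimal sketch, assuming this journal excerpt has been saved to a local text file (the kubelet.log filename is illustrative, not part of the log):

import re
from collections import Counter

# Tally kubelet "Node became not ready" entries per second in a saved
# journal excerpt. Multiple entries can share one physical line in the
# raw dump, so scan each line with finditer rather than a single search.
pattern = re.compile(
    r'I(\d{4} \d{2}:\d{2}:\d{2})\.\d+ \d+ setters\.go:603\] "Node became not ready"'
)

counts = Counter()
with open("kubelet.log", encoding="utf-8") as f:  # hypothetical dump of this log
    for line in f:
        for m in pattern.finditer(line):
            counts[m.group(1)] += 1

for second, n in sorted(counts.items()):
    print(f"{second}: {n} occurrence(s)")

Run against the excerpt above, this reports roughly ten occurrences per second, consistent with the ~100 ms interval noted in the collapsed block.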
Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.334022 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/1.log"
Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.336203 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3"}
Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.337045 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27"
Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.348955 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.363255 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.375646 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.388398 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.403380 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.406808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.406846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.406877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.406896 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.406908 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.418999 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.429606 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.445629 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.469524 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"message\\\":\\\" removal\\\\nI1201 09:15:15.063670 6111 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:15.063702 6111 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:15.063838 6111 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.063928 6111 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 09:15:15.063961 6111 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:15:15.063997 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:15.064015 6111 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 09:15:15.064021 6111 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:15:15.064042 6111 factory.go:656] Stopping watch factory\\\\nI1201 09:15:15.064053 6111 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:15:15.064086 6111 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.064094 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:15:15.064102 6111 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:15:15.064111 6111 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:15:15.064120 6111 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 09:15:15.064187 6111 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.485295 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.506514 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.508923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.508981 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.508998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.509018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.509032 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.522814 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.535226 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.545394 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.555944 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.565907 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:30Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.612018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.612098 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.612113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.612136 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.612148 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.714583 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.714620 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.714631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.714644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.714653 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.816830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.816873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.816885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.816900 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.816910 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.919035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.919086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.919099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.919120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.919132 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:30Z","lastTransitionTime":"2025-12-01T09:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:30 crc kubenswrapper[4763]: I1201 09:15:30.994174 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:30 crc kubenswrapper[4763]: E1201 09:15:30.994315 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.024700 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.024760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.024777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.024798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.024810 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.127110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.127142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.127150 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.127163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.127172 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.229701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.229748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.229757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.229773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.229783 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.332400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.332444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.332477 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.332495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.332506 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.341390 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/2.log" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.342021 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/1.log" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.345952 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3" exitCode=1 Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.345994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.346032 4763 scope.go:117] "RemoveContainer" containerID="4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.346935 4763 scope.go:117] "RemoveContainer" containerID="360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3" Dec 01 09:15:31 crc kubenswrapper[4763]: E1201 09:15:31.347197 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.361615 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.373214 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.384586 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.396352 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.412563 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.431324 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4039834969e3bb76fa9e967878f97288b858b74d652fcf1f592f2c9a267b7679\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"message\\\":\\\" removal\\\\nI1201 09:15:15.063670 6111 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:15:15.063702 6111 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:15:15.063838 6111 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.063928 6111 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 09:15:15.063961 6111 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1201 09:15:15.063997 6111 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1201 09:15:15.064015 6111 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 09:15:15.064021 6111 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:15:15.064042 6111 factory.go:656] Stopping watch factory\\\\nI1201 09:15:15.064053 6111 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:15:15.064086 6111 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1201 09:15:15.064094 6111 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1201 09:15:15.064102 6111 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:15:15.064111 6111 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:15:15.064120 6111 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 09:15:15.064187 6111 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 
transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.435194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.435233 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.435245 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.435261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.435272 4763 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.442331 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.454840 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.467195 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.476524 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.490119 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.502095 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.512422 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.521887 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.533889 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.537804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.537836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.537846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.537860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.537869 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.544568 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:31Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.639935 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.640013 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.640048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.640077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.640094 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.743061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.743112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.743126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.743141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.743152 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.846262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.846308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.846322 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.846339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.846352 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.948602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.948679 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.948706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.948734 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.948755 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:31Z","lastTransitionTime":"2025-12-01T09:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.992983 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.993049 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:31 crc kubenswrapper[4763]: I1201 09:15:31.992986 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:31 crc kubenswrapper[4763]: E1201 09:15:31.993154 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:31 crc kubenswrapper[4763]: E1201 09:15:31.993264 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:31 crc kubenswrapper[4763]: E1201 09:15:31.993308 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.021921 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.022049 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.022105 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs podName:db50acd1-5694-49bc-9027-e96f7612e795 nodeName:}" failed. No retries permitted until 2025-12-01 09:15:48.022091304 +0000 UTC m=+65.290740072 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs") pod "network-metrics-daemon-rtkzb" (UID: "db50acd1-5694-49bc-9027-e96f7612e795") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.052067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.052106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.052116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.052136 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.052147 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.154404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.154692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.154707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.154725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.154762 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.257914 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.257985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.257998 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.258017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.258338 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.351798 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/2.log" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.355879 4763 scope.go:117] "RemoveContainer" containerID="360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3" Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.356029 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.360151 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.360198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.360208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.360220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.360229 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.373686 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.386908 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.400496 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.413998 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.429252 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.446527 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.457704 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.462682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.462731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.462750 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.462771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.462786 4763 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.473247 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/
etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.493605 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6
ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.504840 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.516702 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.528053 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.537810 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.553392 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.565523 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.565572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.565582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.565598 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.565608 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.565643 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.578166 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:32Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.668540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.668579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.668589 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.668605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.668616 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.771709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.771756 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.771771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.771788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.771799 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.828698 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.828872 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:16:04.82884363 +0000 UTC m=+82.097492398 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.829056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.829188 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.829242 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-01 09:16:04.829232302 +0000 UTC m=+82.097881150 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.874440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.874499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.874507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.874521 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.874529 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.929560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.929641 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.929666 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929733 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929746 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929767 4763 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929781 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929789 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929809 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929820 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929794 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:16:04.929777075 +0000 UTC m=+82.198425843 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929863 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:16:04.929852827 +0000 UTC m=+82.198501595 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.929876 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:16:04.929868948 +0000 UTC m=+82.198517716 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.977408 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.977451 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.977487 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.977504 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.977514 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:32Z","lastTransitionTime":"2025-12-01T09:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:32 crc kubenswrapper[4763]: I1201 09:15:32.994058 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:32 crc kubenswrapper[4763]: E1201 09:15:32.994153 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.007219 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.019155 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.030030 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.041312 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.050393 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.062323 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.074005 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.079446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.079499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.079510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.079529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.079539 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:33Z","lastTransitionTime":"2025-12-01T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.083837 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.105334 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\
\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.123733 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6
ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.137162 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.149158 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.163737 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.178613 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.182497 4763 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.182568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.182579 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.182595 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.182604 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:33Z","lastTransitionTime":"2025-12-01T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.192652 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.205284 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:33Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.284981 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.285413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.285424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.285439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.285490 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:33Z","lastTransitionTime":"2025-12-01T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.388196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.388229 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.388237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.388249 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.388257 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:33Z","lastTransitionTime":"2025-12-01T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.491316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.491366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.491379 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.491397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.491409 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:33Z","lastTransitionTime":"2025-12-01T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.594047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.594081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.594090 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.594103 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.594112 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:33Z","lastTransitionTime":"2025-12-01T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.697260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.697314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.697323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.697342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.697353 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:33Z","lastTransitionTime":"2025-12-01T09:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.994001 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:33 crc kubenswrapper[4763]: E1201 09:15:33.994118 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.994192 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:15:33 crc kubenswrapper[4763]: E1201 09:15:33.994230 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 09:15:33 crc kubenswrapper[4763]: I1201 09:15:33.994267 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:33 crc kubenswrapper[4763]: E1201 09:15:33.994312 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795"
Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.929798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.929834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.929842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.929857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.929868 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:34Z","lastTransitionTime":"2025-12-01T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:34 crc kubenswrapper[4763]: E1201 09:15:34.947098 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:34Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.952326 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.952365 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.952380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.952400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.952413 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:34Z","lastTransitionTime":"2025-12-01T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:34 crc kubenswrapper[4763]: E1201 09:15:34.970747 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:34Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.976154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.976375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
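The "Error updating node status, will retry" entry above is the underlying failure in this excerpt: every node-status patch is rejected because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-01. A minimal Python 3 sketch (illustrative only, not part of the log; it assumes the third-party cryptography package is installed, and takes the host and port from the webhook URL in the error) to read that certificate's validity window without trusting it:

import socket
import ssl

from cryptography import x509  # third-party: pip install cryptography

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint from the error message above

# Skip verification on purpose: the point is to inspect the expired certificate,
# which a verifying client (like the kubelet here) rejects outright.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER; available even unverified

cert = x509.load_der_x509_certificate(der)
print("subject:  ", cert.subject.rfc4514_string())
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)  # expect 2025-08-24 17:21:41 per the log

If notAfter is indeed in the past, rotating the webhook's serving certificate (or correcting the node clock, if that is what is wrong) is what would unblock these status patches.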
event="NodeHasNoDiskPressure" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.976555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.976719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.976882 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:34Z","lastTransitionTime":"2025-12-01T09:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:34 crc kubenswrapper[4763]: E1201 09:15:34.991611 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:34Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.993731 4763 util.go:30] "No sandbox for pod can be found. 
Dec 01 09:15:34 crc kubenswrapper[4763]: I1201 09:15:34.993731 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:15:34 crc kubenswrapper[4763]: E1201 09:15:34.994026 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
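Each "Error syncing pod, skipping ... network is not ready" entry above names a pod that cannot get a sandbox until a CNI config appears in /etc/kubernetes/cni/net.d/. A small sketch (again illustrative, not from the log) that tallies which pods are blocked and how often, given journal output on stdin, e.g. journalctl -u kubelet | python3 blocked_pods.py:

import re
import sys
from collections import Counter

# Matches the kubelet's CNI-not-ready sync failures, e.g.:
#   ... "Error syncing pod, skipping" err="network is not ready: ..." pod="ns/name" podUID="..."
PATTERN = re.compile(
    r'"Error syncing pod, skipping" err="network is not ready[^"]*" pod="([^"]+)"'
)

counts = Counter()
for line in sys.stdin:
    for pod in PATTERN.findall(line):  # findall handles lines carrying several entries
        counts[pod] += 1

for pod, n in counts.most_common():
    print(f"{n:5d}  {pod}")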
node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.013207 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.013227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.013242 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: E1201 09:15:35.030426 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:35Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:35 crc kubenswrapper[4763]: E1201 09:15:35.030583 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.032313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.032349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.032359 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.032373 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.032383 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.135002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.135045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.135093 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.135112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.135125 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.237220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.237265 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.237276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.237294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.237306 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.339684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.339765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.339779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.339793 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.339809 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.442549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.442578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.442588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.442605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.442625 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.545535 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.545749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.545851 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.545915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.545977 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.653130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.653191 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.653208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.653230 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.653247 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.755974 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.756246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.756334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.756402 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.756480 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.858947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.859194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.859304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.859496 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.859649 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.961749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.961802 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.961810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.961824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.961849 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:35Z","lastTransitionTime":"2025-12-01T09:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.993900 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.993966 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:35 crc kubenswrapper[4763]: E1201 09:15:35.994055 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:35 crc kubenswrapper[4763]: E1201 09:15:35.994109 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:35 crc kubenswrapper[4763]: I1201 09:15:35.994942 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:35 crc kubenswrapper[4763]: E1201 09:15:35.995258 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.064410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.064474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.064486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.064506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.064518 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.166397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.166423 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.166433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.166483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.166494 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.268861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.268911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.268927 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.268945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.268961 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.371312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.371354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.371363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.371376 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.371385 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.474626 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.474676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.474692 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.474713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.474729 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.577304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.577379 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.577390 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.577405 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.577415 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.680115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.680158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.680169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.680184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.680196 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.784573 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.784616 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.784643 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.784660 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.784672 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.887977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.888015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.888026 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.888041 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.888050 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.991169 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.991208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.991220 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.991236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.991247 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:36Z","lastTransitionTime":"2025-12-01T09:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:36 crc kubenswrapper[4763]: I1201 09:15:36.993652 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:36 crc kubenswrapper[4763]: E1201 09:15:36.993746 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.093260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.093320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.093334 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.093353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.093365 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.196086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.196120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.196133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.196154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.196168 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.298282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.298316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.298325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.298339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.298348 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.399949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.399982 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.399990 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.400003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.400012 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.502145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.502188 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.502200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.502216 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.502231 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.605933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.605994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.606003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.606033 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.606043 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.708059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.708104 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.708113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.708142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.708160 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.810132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.810198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.810211 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.810227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.810238 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.912778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.912822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.912832 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.912847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.912859 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:37Z","lastTransitionTime":"2025-12-01T09:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.993624 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.993778 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:37 crc kubenswrapper[4763]: E1201 09:15:37.993885 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:37 crc kubenswrapper[4763]: I1201 09:15:37.993968 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:37 crc kubenswrapper[4763]: E1201 09:15:37.994165 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:37 crc kubenswrapper[4763]: E1201 09:15:37.994798 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.015983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.016039 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.016055 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.016074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.016085 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.118391 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.118425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.118437 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.118453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.118487 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.221145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.221325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.221343 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.221372 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.221390 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.323743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.323789 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.323800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.323821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.323836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.426702 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.426762 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.426779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.426806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.426822 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.529767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.529824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.529836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.529855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.529871 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.632425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.632475 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.632492 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.632506 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.632514 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.674199 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.682887 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.687810 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.698205 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.708968 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.719103 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.734146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.734197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.734213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.734233 
4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.734248 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.740427 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0c
e3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.753741 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.777304 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.804336 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.814133 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.826430 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.836946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.837004 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc 
kubenswrapper[4763]: I1201 09:15:38.837014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.837031 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.837041 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.842111 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.853642 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.862886 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc 
kubenswrapper[4763]: I1201 09:15:38.873575 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.883218 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.892599 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:38Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.939238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.939285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.939293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.939309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.939319 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:38Z","lastTransitionTime":"2025-12-01T09:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:38 crc kubenswrapper[4763]: I1201 09:15:38.994141 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:38 crc kubenswrapper[4763]: E1201 09:15:38.994274 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.041709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.041763 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.041778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.041795 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.041807 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:39Z","lastTransitionTime":"2025-12-01T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
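Has your network provider started?"}

The repeated "Failed to update status for pod" records above share a single root cause: every patch is rejected because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z, long before the node's current clock of 2025-12-01T09:15:38Z. The Go sketch below reproduces the validity check that fails inside the TLS handshake; it assumes the webhook is still listening on that address and deliberately skips chain verification so the expired leaf can be inspected rather than rejected.

package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Dial the webhook endpoint named in the log. InsecureSkipVerify lets us
	// retrieve the expired certificate rather than failing the handshake the
	// way the kubelet's client did.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		fmt.Println("no peer certificate presented")
		return
	}
	leaf := certs[0]
	fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
		leaf.Subject,
		leaf.NotBefore.UTC().Format(time.RFC3339),
		leaf.NotAfter.UTC().Format(time.RFC3339))
	// The same comparison surfaces in the log as
	// "current time <now> is after <notAfter>".
	if now := time.Now(); now.After(leaf.NotAfter) {
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339), leaf.NotAfter.UTC().Format(time.RFC3339))
	}
}

Run against the logged endpoint, this prints the certificate's validity window and reproduces the exact expiry comparison quoted in the kubelet errors.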
[... 8 near-identical "Recording event message for node" / "Node became not ready" groups (09:15:39.144 through 09:15:39.867, roughly 100 ms apart) elided ...] Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.969619 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.969651 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.969662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.969677 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.969687 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:39Z","lastTransitionTime":"2025-12-01T09:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.993292 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:39 crc kubenswrapper[4763]: E1201 09:15:39.993659 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.993310 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:39 crc kubenswrapper[4763]: E1201 09:15:39.993883 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:39 crc kubenswrapper[4763]: I1201 09:15:39.993292 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:39 crc kubenswrapper[4763]: E1201 09:15:39.994064 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.072047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.072081 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.072091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.072107 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.072117 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:40Z","lastTransitionTime":"2025-12-01T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
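Has your network provider started?"}

Alongside the webhook failures, the kubelet keeps marking the node NotReady because the container runtime reports no CNI configuration. The helper below is a hypothetical approximation of that check (the real logic lives in the runtime's CNI handling, not in this form): it simply looks for a .conf, .conflist, or .json file in the directory named by the log. Until the network operator materializes such a file, sandbox creation for the pods listed above keeps failing.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig is a hypothetical stand-in for the runtime's check behind
// "no CNI configuration file in /etc/kubernetes/cni/net.d/": it reports
// whether the directory contains any recognizable network config.
func hasCNIConfig(confDir string) (bool, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	fmt.Printf("cni config present: %v (err: %v)\n", ok, err)
}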
[... 5 near-identical heartbeat groups (09:15:40.174 through 09:15:40.588, roughly 100 ms apart) elided ...] Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.690412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.690495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.690509 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.690524 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.690534 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:40Z","lastTransitionTime":"2025-12-01T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.793422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.793525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.793544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.793571 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.793588 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:40Z","lastTransitionTime":"2025-12-01T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.896335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.896517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.896534 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.896565 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.896580 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:40Z","lastTransitionTime":"2025-12-01T09:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:40 crc kubenswrapper[4763]: I1201 09:15:40.993609 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:40 crc kubenswrapper[4763]: E1201 09:15:40.993745 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.000034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.000067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.000078 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.000095 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.000107 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:41Z","lastTransitionTime":"2025-12-01T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.101818 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.101859 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.101870 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.101885 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.101896 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:41Z","lastTransitionTime":"2025-12-01T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
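Has your network provider started?"}

Every "Node became not ready" entry above carries the same Ready condition payload. The stand-in type below mirrors only the fields visible in the log (it is not the real corev1.NodeCondition type) and marshals to the same JSON shape the kubelet publishes in these records.

package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// nodeCondition is a minimal stand-in for the condition object printed by
// setters.go in the log; field order matches the logged JSON.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	now := time.Now().UTC().Format(time.RFC3339)
	c := nodeCondition{
		Type:               "Ready",
		Status:             "False",
		LastHeartbeatTime:  now,
		LastTransitionTime: now,
		Reason:             "KubeletNotReady",
		Message:            "container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?",
	}
	b, err := json.Marshal(c)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(string(b))
}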
[... 5 near-identical heartbeat groups (09:15:41.204 through 09:15:41.615, roughly 100 ms apart) elided ...] Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.718730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.718804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.718819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.718870 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.718890 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:41Z","lastTransitionTime":"2025-12-01T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.824435 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.824979 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.824996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.825011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.825021 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:41Z","lastTransitionTime":"2025-12-01T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.928518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.928582 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.928595 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.928614 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.928628 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:41Z","lastTransitionTime":"2025-12-01T09:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.993448 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.993541 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:41 crc kubenswrapper[4763]: E1201 09:15:41.993632 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:41 crc kubenswrapper[4763]: I1201 09:15:41.993659 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:41 crc kubenswrapper[4763]: E1201 09:15:41.993721 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:41 crc kubenswrapper[4763]: E1201 09:15:41.993796 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.030385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.030416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.030424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.030444 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.030471 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.133644 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.133691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.133701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.133718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.133731 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.236527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.236572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.236584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.236597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.236607 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.340158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.340217 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.340238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.340261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.340279 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.443594 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.443672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.443697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.443726 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.443751 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.547189 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.547257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.547279 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.547308 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.547328 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.651364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.651441 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.651495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.651523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.651541 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.754272 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.754316 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.754327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.754345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.754358 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.857572 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.857717 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.857735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.857761 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.857780 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.960481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.960526 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.960539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.960560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.960573 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:42Z","lastTransitionTime":"2025-12-01T09:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:42 crc kubenswrapper[4763]: I1201 09:15:42.993396 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:42 crc kubenswrapper[4763]: E1201 09:15:42.993672 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.019245 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.034487 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.051576 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.063921 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.064427 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc 
kubenswrapper[4763]: I1201 09:15:43.064632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.064788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.064935 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.075128 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0c
e3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.087435 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.102919 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.119644 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.139268 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.154343 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.168899 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.169346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.169602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.169829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.169992 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.171415 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.188789 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:
14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.207759 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.223943 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"450c19db-6088-438e-a2ec-a657e1c918f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.238423 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.252135 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.264025 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.273367 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.273587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.273752 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.273875 
4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.273970 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.278507 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:43Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.376394 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.376433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.376440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.376468 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.376477 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.479762 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.479810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.479824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.479842 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.479853 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.583775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.583860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.583877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.583909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.583923 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.686869 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.686915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.686925 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.686946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.686959 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.790024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.790074 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.790088 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.790110 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.790125 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.894101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.894682 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.894696 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.894723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.894735 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.993427 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:43 crc kubenswrapper[4763]: E1201 09:15:43.993638 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.993729 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:43 crc kubenswrapper[4763]: E1201 09:15:43.993798 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.993858 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:43 crc kubenswrapper[4763]: E1201 09:15:43.993919 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.997830 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.997884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.997893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.997920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:43 crc kubenswrapper[4763]: I1201 09:15:43.997932 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:43Z","lastTransitionTime":"2025-12-01T09:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.102484 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.102557 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.102567 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.102584 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.102594 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.204870 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.204921 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.204940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.204965 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.204982 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.307807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.307860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.307872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.307894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.307909 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.410781 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.410835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.410847 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.410867 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.410879 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.513836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.513879 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.513892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.513908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.513919 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.617775 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.617833 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.617845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.617864 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.618180 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.722073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.722495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.722597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.722764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.722890 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.825903 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.826179 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.826309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.826400 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.826508 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.928958 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.929730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.929836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.929932 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.930006 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:44Z","lastTransitionTime":"2025-12-01T09:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:44 crc kubenswrapper[4763]: I1201 09:15:44.993559 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:44 crc kubenswrapper[4763]: E1201 09:15:44.993718 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
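
The repeated "Error syncing pod, skipping" entries all trace to one condition: the runtime reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/ yet. A rough approximation of that readiness check (a sketch, not the actual CRI-O implementation): the runtime considers the network ready once a loadable network config appears in the conf dir.

    import os

    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the log message

    def cni_config_present(conf_dir: str = CNI_CONF_DIR) -> bool:
        """Return True if a CNI network config file exists in conf_dir."""
        try:
            entries = os.listdir(conf_dir)
        except FileNotFoundError:
            return False
        # CNI accepts .conf, .conflist and .json network config files.
        return any(e.endswith((".conf", ".conflist", ".json")) for e in entries)

    if not cni_config_present():
        print("NetworkReady=false: no CNI configuration file found")

Until the cluster network provider (multus pods are visible in the entries above) writes that file, kubelet keeps the node NotReady and keeps skipping pod sync, which is exactly the loop recorded here.
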
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.032483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.032702 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.032765 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.032834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.032907 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.111670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.111898 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.111972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.112031 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.112095 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.124444 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:45Z is after 
2025-08-24T17:21:41Z" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.127559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.127740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.127808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.127882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.127942 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.141425 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:45Z is after 
2025-08-24T17:21:41Z" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.145539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.145609 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.145623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.145649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.145666 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.157868 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:45Z is after 
2025-08-24T17:21:41Z" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.162149 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.162614 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.162732 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.162809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.162874 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.174306 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:45Z is after 
2025-08-24T17:21:41Z" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.178426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.178608 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.178693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.178760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.178829 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.194914 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
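The repeated "failed to patch status" errors above share a single root cause: the node-identity webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-01, so every node-status patch is rejected the same way until that certificate is renewed. A minimal check of what the endpoint is actually serving, assuming shell access to the CRC VM (hypothetical here), might look like:

    # Print the validity window of the certificate served on the webhook port
    # (address and port taken from the log entry above).
    echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null | openssl x509 -noout -dates
    # Compare with the clock the kubelet is using:
    date -u

If notAfter is in the past, the x509 failure above is expected; on CRC this commonly happens when the VM has been offline past its embedded certificates' lifetime, and it normally clears once the cluster's own certificate rotation completes after startup.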
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:45Z is after 
2025-08-24T17:21:41Z" Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.195619 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.197639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.197816 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.198011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.198262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.198428 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.302088 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.302539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.302737 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.302915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.303092 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.406259 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.406544 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.406671 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.406757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.406840 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.509507 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.509555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.509570 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.509589 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.509604 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.612735 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.612995 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.613128 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.613246 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.613361 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.715778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.715817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.715828 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.715844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.715854 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.819302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.819358 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.819374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.819397 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.819413 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.922314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.922598 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.922691 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.922774 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.922852 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:45Z","lastTransitionTime":"2025-12-01T09:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.993346 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.993452 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.993523 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.993563 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.993644 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795"
Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.993820 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 09:15:45 crc kubenswrapper[4763]: I1201 09:15:45.997232 4763 scope.go:117] "RemoveContainer" containerID="360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3"
Dec 01 09:15:45 crc kubenswrapper[4763]: E1201 09:15:45.997972 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.025403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.025448 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.025479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.025499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.025512 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.129183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.129269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.129292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.129320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.129346 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.233686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.233748 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.233771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.233801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.233820 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.336600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.336659 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.336678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.336701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.336718 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.439192 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.439417 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.439493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.439575 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.439662 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.542709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.542773 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.542794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.542819 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.542837 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.647810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.648131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.648261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.648374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.648499 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.750363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.750404 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.750414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.750433 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.750448 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.852880 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.852934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.852947 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.852968 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.852982 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.955699 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.956039 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.956126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.956208 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.956289 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:46Z","lastTransitionTime":"2025-12-01T09:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:46 crc kubenswrapper[4763]: I1201 09:15:46.993207 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:46 crc kubenswrapper[4763]: E1201 09:15:46.993397 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.058611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.058661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.058672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.058689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.058702 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:47Z","lastTransitionTime":"2025-12-01T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.161403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.161439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.161450 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.161488 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.161500 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:47Z","lastTransitionTime":"2025-12-01T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.264146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.264201 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.264224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.264252 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.264272 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:47Z","lastTransitionTime":"2025-12-01T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.366695 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.366722 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.366730 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.366744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.366753 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:47Z","lastTransitionTime":"2025-12-01T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.468924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.468964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.468972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.468986 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.468995 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:47Z","lastTransitionTime":"2025-12-01T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.571514 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.571550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.571561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.571577 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.571586 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:47Z","lastTransitionTime":"2025-12-01T09:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[The identical five-record node-status group (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, "Node became not ready") repeats at roughly 100 ms intervals from 09:15:47.674 through 09:15:50.440, with only the heartbeat and transition timestamps advancing. The distinct records logged during that interval follow.]
Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.993287 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.993361 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:15:47 crc kubenswrapper[4763]: I1201 09:15:47.993311 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:47 crc kubenswrapper[4763]: E1201 09:15:47.993417 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795"
Dec 01 09:15:47 crc kubenswrapper[4763]: E1201 09:15:47.993548 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 09:15:47 crc kubenswrapper[4763]: E1201 09:15:47.993630 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 09:15:48 crc kubenswrapper[4763]: I1201 09:15:48.113208 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:48 crc kubenswrapper[4763]: E1201 09:15:48.113344 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 01 09:15:48 crc kubenswrapper[4763]: E1201 09:15:48.113394 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs podName:db50acd1-5694-49bc-9027-e96f7612e795 nodeName:}" failed. No retries permitted until 2025-12-01 09:16:20.113379511 +0000 UTC m=+97.382028279 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs") pod "network-metrics-daemon-rtkzb" (UID: "db50acd1-5694-49bc-9027-e96f7612e795") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 01 09:15:48 crc kubenswrapper[4763]: I1201 09:15:48.993090 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:15:48 crc kubenswrapper[4763]: E1201 09:15:48.993235 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 01 09:15:49 crc kubenswrapper[4763]: I1201 09:15:49.993142 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:49 crc kubenswrapper[4763]: E1201 09:15:49.993251 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 09:15:49 crc kubenswrapper[4763]: I1201 09:15:49.993343 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:15:49 crc kubenswrapper[4763]: I1201 09:15:49.993346 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:49 crc kubenswrapper[4763]: E1201 09:15:49.993519 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 09:15:49 crc kubenswrapper[4763]: E1201 09:15:49.993538 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.467757 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/0.log"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.467817 4763 generic.go:334] "Generic (PLEG): container finished" podID="192e1ecd-fa1f-4227-a40c-4f7773682880" containerID="18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d" exitCode=1
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.467856 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr552" event={"ID":"192e1ecd-fa1f-4227-a40c-4f7773682880","Type":"ContainerDied","Data":"18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d"}
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.468605 4763 scope.go:117] "RemoveContainer" containerID="18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.486064 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"450c19db-6088-438e-a2ec-a657e1c918f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.503046 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.517635 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.532249 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.544523 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.544573 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.544591 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.544609 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.544620 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:50Z","lastTransitionTime":"2025-12-01T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.546635 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.560128 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.571887 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z"
Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.587709 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.610102 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.625545 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.640918 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.647339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.647380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.647393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.647411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.647423 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:50Z","lastTransitionTime":"2025-12-01T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.655904 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.666852 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.685544 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.704118 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.720531 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.733618 4763 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:49Z\\\",\\\"message\\\":\\\"2025-12-01T09:15:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d\\\\n2025-12-01T09:15:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d to /host/opt/cni/bin/\\\\n2025-12-01T09:15:04Z [verbose] multus-daemon started\\\\n2025-12-01T09:15:04Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:15:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:50Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.749611 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.749649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.749657 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.749672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.749684 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:50Z","lastTransitionTime":"2025-12-01T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.851948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.852290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.852302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.852317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.852327 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:50Z","lastTransitionTime":"2025-12-01T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.954767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.954804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.954814 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.954829 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.954841 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:50Z","lastTransitionTime":"2025-12-01T09:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:50 crc kubenswrapper[4763]: I1201 09:15:50.993371 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:50 crc kubenswrapper[4763]: E1201 09:15:50.993728 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.056757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.056789 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.056798 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.056812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.056822 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.159436 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.159479 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.159490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.159503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.159511 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.262154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.262500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.262596 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.262683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.262780 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.365335 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.365370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.365380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.365393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.365633 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.468667 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.468707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.468725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.468743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.468755 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.471170 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/0.log" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.471216 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr552" event={"ID":"192e1ecd-fa1f-4227-a40c-4f7773682880","Type":"ContainerStarted","Data":"92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.483771 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.498027 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"450c19db-6088-438e-a2ec-a657e1c918f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.512432 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.527697 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.540937 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.558269 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.571237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.571263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.571276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.571292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.571304 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.573212 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.587357 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.601820 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.619085 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0c
e3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.628810 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.642824 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.657781 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.671488 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.673887 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.673949 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.673961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.673976 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.673987 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.685280 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02
b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.701076 4763 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2b
cf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.715541 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:49Z\\\",\\\"message\\\":\\\"2025-12-01T09:15:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d\\\\n2025-12-01T09:15:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d to /host/opt/cni/bin/\\\\n2025-12-01T09:15:04Z [verbose] multus-daemon started\\\\n2025-12-01T09:15:04Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:15:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:51Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.776412 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.776457 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.776482 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.776495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.776504 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.879005 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.879043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.879053 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.879067 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.879076 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.980964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.980991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.981000 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.981012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.981021 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:51Z","lastTransitionTime":"2025-12-01T09:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.993272 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.993338 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:51 crc kubenswrapper[4763]: E1201 09:15:51.993391 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:51 crc kubenswrapper[4763]: I1201 09:15:51.993273 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:51 crc kubenswrapper[4763]: E1201 09:15:51.993495 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:51 crc kubenswrapper[4763]: E1201 09:15:51.993580 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.083037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.083094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.083103 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.083122 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.083135 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.185330 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.185370 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.185383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.185399 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.185411 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.287923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.287961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.287972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.287986 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.287995 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.390240 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.390278 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.390287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.390303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.390312 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.491804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.491876 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.491893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.491911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.491921 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.594302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.594349 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.594362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.594395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.594407 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.696254 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.696291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.696302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.696317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.696328 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.798353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.798384 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.798393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.798406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.798414 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.900718 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.900760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.900770 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.900783 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.900793 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:52Z","lastTransitionTime":"2025-12-01T09:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:52 crc kubenswrapper[4763]: I1201 09:15:52.993931 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:52 crc kubenswrapper[4763]: E1201 09:15:52.994087 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.007746 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.007787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.007797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.007765 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.007815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.007974 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.020945 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.044655 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.056832 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.068991 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.088989 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.101915 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.109886 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.109924 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.109951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.109967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.109978 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.113397 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.128746 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\
\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.145387 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6
ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.160997 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.173002 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:49Z\\\",\\\"message\\\":\\\"2025-12-01T09:15:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d\\\\n2025-12-01T09:15:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d to /host/opt/cni/bin/\\\\n2025-12-01T09:15:04Z [verbose] multus-daemon started\\\\n2025-12-01T09:15:04Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:15:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.183659 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"450c19db-6088-438e-a2ec-a657e1c918f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.196019 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.204751 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.212034 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.212057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.212065 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.212091 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.212101 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.213933 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.222595 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:53Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:53 crc 
kubenswrapper[4763]: I1201 09:15:53.315161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.315221 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.315237 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.315260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.315277 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.417080 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.417117 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.417132 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.417152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.417167 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.519483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.519520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.519532 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.519548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.519559 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.622024 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.622073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.622084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.622105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.622124 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.724648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.724694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.724710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.724728 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.724740 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.827292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.827321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.827329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.827342 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.827350 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.929381 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.929410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.929421 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.929474 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.929487 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:53Z","lastTransitionTime":"2025-12-01T09:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.993509 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.993560 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:53 crc kubenswrapper[4763]: I1201 09:15:53.993679 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:53 crc kubenswrapper[4763]: E1201 09:15:53.993773 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:53 crc kubenswrapper[4763]: E1201 09:15:53.993889 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:53 crc kubenswrapper[4763]: E1201 09:15:53.994022 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.033339 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.033395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.033427 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.033453 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.033493 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.137827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.137872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.137884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.137922 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.137936 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.240289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.240324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.240333 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.240345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.240354 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.342621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.342658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.342666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.342680 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.342689 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.445059 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.445105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.445113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.445129 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.445139 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.546875 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.546921 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.546930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.546945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.546955 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.649070 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.649140 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.649153 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.649166 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.649175 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.752200 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.752228 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.752236 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.752250 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.752259 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.854046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.854086 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.854097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.854113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.854123 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.956588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.956613 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.956621 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.956634 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.956642 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:54Z","lastTransitionTime":"2025-12-01T09:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:54 crc kubenswrapper[4763]: I1201 09:15:54.993352 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:54 crc kubenswrapper[4763]: E1201 09:15:54.993661 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.059327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.059383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.059395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.059411 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.059423 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.161877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.161923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.161939 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.161972 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.161992 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.264263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.264317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.264329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.264345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.264357 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.366747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.366794 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.366806 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.366824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.366836 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.468576 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.468882 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.468975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.469069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.469163 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.571853 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.571892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.571902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.571917 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.571927 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.585414 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.585724 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.585973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.586194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.586315 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.598197 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.602077 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.602252 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.602482 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.602693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.602911 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.617068 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.620808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.620872 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
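
The NotReady condition above keeps citing the same root cause: the kubelet finds no CNI configuration file in /etc/kubernetes/cni/net.d/, so it reports NetworkReady=false until the network provider writes one. A minimal sketch of that check in Go is below; it is an illustrative stand-alone diagnostic, not the kubelet's actual code, and the .conf/.conflist/.json patterns are the ones libcni conventionally loads:

```go
// Minimal sketch (not the kubelet's implementation): reproduce the
// "no CNI configuration file" check by listing the conf dir named in the log.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory from the log message
	var found []string
	for _, ext := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(confDir, ext))
		if err != nil {
			fmt.Fprintln(os.Stderr, "bad glob pattern:", err)
			os.Exit(1)
		}
		found = append(found, matches...)
	}
	if len(found) == 0 {
		// This is the state the kubelet is reporting: NetworkReady stays
		// false until the network provider writes a config here.
		fmt.Println("no CNI configuration file in", confDir)
		return
	}
	for _, f := range found {
		fmt.Println("found CNI config:", f)
	}
}
```

On a healthy OpenShift node the network provider (OVN-Kubernetes here) typically drops its conflist into this directory shortly after startup, at which point the runtime flips NetworkReady back to true and the NotReady spam stops.
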
event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.620894 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.620913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.620987 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.633962 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.645108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.645145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.645154 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.645168 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.645177 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.656699 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.660723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.660753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
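
Every patch attempt in this run dies the same way: the API server cannot call the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because the serving certificate expired on 2025-08-24, well before the current clock time. One quick way to confirm what the endpoint is presenting is to dial it and read the certificate's validity window; the sketch below is an assumed ad-hoc diagnostic (not OpenShift tooling), and it skips chain verification precisely so the handshake still completes on an expired certificate:

```go
// Minimal sketch (assumed diagnostic): dial the webhook endpoint from the
// log and print the serving certificate's validity window.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // endpoint from the failed webhook POST
	// InsecureSkipVerify lets us inspect an expired cert without the
	// handshake being rejected; this is for diagnosis only.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial %s: %v", addr, err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		// Matches the kubelet error: current time is after notAfter.
		fmt.Println("certificate is expired")
	}
}
```

As the log shows, the kubelet makes five patch attempts and then gives up ("update node status exceeds retry count" below) before starting the cycle again; it will keep looping until the webhook's certificate is rotated.
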
event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.660766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.660784 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.660795 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.676406 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:15:55Z is after 2025-08-24T17:21:41Z" Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.676642 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.677969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.678014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.678028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.678046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.678059 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.780627 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.780689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.780709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.780736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.780753 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.883130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.883227 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.883238 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.883252 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.883260 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.985587 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.985645 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.985661 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.985684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.985699 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:55Z","lastTransitionTime":"2025-12-01T09:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.993444 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.993519 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:15:55 crc kubenswrapper[4763]: I1201 09:15:55.993553 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.993653 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.993740 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:15:55 crc kubenswrapper[4763]: E1201 09:15:55.993842 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
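The node-status updates above are failing because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is presenting an expired serving certificate (notAfter 2025-08-24T17:21:41Z, against a current time of 2025-12-01T09:15:55Z). A minimal sketch of how one might confirm the expiry from the node itself, assuming Python 3 and the third-party cryptography package are available on the host (an assumption, not something the log shows):

# inspect_webhook_cert.py - diagnostic sketch, not part of the cluster.
# Connects to the webhook endpoint named in the log error above without
# verifying the chain, then prints the certificate's validity window.
import datetime
import socket
import ssl

from cryptography import x509  # third-party package; assumed installed

HOST, PORT = "127.0.0.1", 9743  # endpoint taken from the webhook error

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # we only want to read the cert, not trust it

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER bytes

cert = x509.load_der_x509_certificate(der)
now = datetime.datetime.now(datetime.timezone.utc)
# not_valid_after_utc requires cryptography >= 42 (another assumption)
print("subject :", cert.subject.rfc4514_string())
print("notAfter:", cert.not_valid_after_utc)
print("expired :", cert.not_valid_after_utc < now)

Confirming the validity window this way distinguishes a genuinely expired certificate, the case here and a common sight when a CRC VM is resumed long after its certificates were minted, from mere clock skew on the node.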
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.088705 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.088755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.088768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.088787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.088799 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.191317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.191366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.191377 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.191395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.191407 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.293508 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.293542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.293554 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.293569 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.293578 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.395945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.395983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.395994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.396009 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.396018 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.498222 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.498279 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.498300 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.498323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.498339 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.601099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.601148 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.601158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.601174 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.601186 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.702977 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.703011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.703020 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.703035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.703046 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.805327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.805364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.805378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.805392 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.805403 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.907083 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.907112 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.907120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.907133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.907142 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:56Z","lastTransitionTime":"2025-12-01T09:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:56 crc kubenswrapper[4763]: I1201 09:15:56.993736 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:15:56 crc kubenswrapper[4763]: E1201 09:15:56.993935 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:15:57 crc kubenswrapper[4763]: I1201 09:15:57.010361 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:57 crc kubenswrapper[4763]: I1201 09:15:57.010413 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:57 crc kubenswrapper[4763]: I1201 09:15:57.010431 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:57 crc kubenswrapper[4763]: I1201 09:15:57.010488 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:57 crc kubenswrapper[4763]: I1201 09:15:57.010517 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:57Z","lastTransitionTime":"2025-12-01T09:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
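Every NodeNotReady entry in this stretch carries the same root cause string: the kubelet finds no CNI configuration under /etc/kubernetes/cni/net.d/, so it reports the container runtime network as not ready and no pod sandbox can be created. A small standard-library sketch of the equivalent check; the directory path is taken from the log message, everything else is illustrative:

# check_cni_conf.py - mirrors the kubelet's complaint by listing CNI
# network configs in the directory it checks and summarizing each one.
import json
from pathlib import Path

CNI_DIR = Path("/etc/kubernetes/cni/net.d")  # path from the kubelet message

if not CNI_DIR.is_dir():
    print(f"{CNI_DIR} is missing entirely")
else:
    confs = sorted(p for p in CNI_DIR.iterdir()
                   if p.suffix in (".conf", ".conflist", ".json"))
    if not confs:
        print(f"{CNI_DIR}: empty - matches the kubelet's complaint")
    for p in confs:
        data = json.loads(p.read_text())
        # a .conflist carries a "plugins" array; a plain .conf is one plugin
        plugins = data.get("plugins", [data])
        types = ", ".join(pl.get("type", "?") for pl in plugins)
        print(f"{p.name}: name={data.get('name', '?')} types=[{types}]")

The directory is normally populated by the cluster's network provider once its pods come up, which is exactly what the repeated "Has your network provider started?" hint is asking about.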
[... the five-entry event/condition sequence continues at ~100 ms intervals from 09:15:57.010 through 09:15:58.960, with further sandbox/"Error syncing pod" retries for network-metrics-daemon-rtkzb, networking-console-plugin-85b44fc459-gdk6g and network-check-target-xd92c at 09:15:57.994, and for network-check-source-55646444c4-trplf at 09:15:58.993 ...]
Dec 01 09:15:58 crc kubenswrapper[4763]: I1201 09:15:58.994300 4763 scope.go:117] "RemoveContainer" containerID="360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3"
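After garbage-collecting the dead container above, the kubelet returns to the same NotReady/retry loop, so recovery is easiest to confirm from the API side once the webhook certificate and the CNI configuration are fixed. A sketch using the kubernetes Python client; the client library and a working kubeconfig are assumptions, while the node and pod names come from the log entries above:

# watch_recovery.py - polls the node's Ready condition and the stuck pods.
import time

from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside a pod
v1 = client.CoreV1Api()

STUCK_PODS = [  # namespace/name pairs taken from the log
    ("openshift-multus", "network-metrics-daemon-rtkzb"),
    ("openshift-network-console", "networking-console-plugin-85b44fc459-gdk6g"),
    ("openshift-network-diagnostics", "network-check-target-xd92c"),
    ("openshift-network-diagnostics", "network-check-source-55646444c4-trplf"),
]

while True:
    node = v1.read_node("crc")
    ready = next(c for c in node.status.conditions if c.type == "Ready")
    print(f"node Ready={ready.status} reason={ready.reason}")
    for ns, name in STUCK_PODS:
        pod = v1.read_namespaced_pod(name, ns)
        print(f"  {ns}/{name}: phase={pod.status.phase}")
    if ready.status == "True":
        break
    time.sleep(5)

The loop exits once the Ready condition flips to True; at that point the four pods listed here should finally get sandboxes and leave Pending.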
Has your network provider started?"} Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.267263 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.267296 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.267306 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.267319 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.267329 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:59Z","lastTransitionTime":"2025-12-01T09:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.370253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.370303 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.370317 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.370338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.370353 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:59Z","lastTransitionTime":"2025-12-01T09:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.472575 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.472620 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.472633 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.472653 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.472670 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:59Z","lastTransitionTime":"2025-12-01T09:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.576006 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.576289 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.576364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.576426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.576546 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:59Z","lastTransitionTime":"2025-12-01T09:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.678982 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.679017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.679027 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.679042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.679052 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:59Z","lastTransitionTime":"2025-12-01T09:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.781822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.781852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.781863 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.781877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.781886 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:59Z","lastTransitionTime":"2025-12-01T09:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.888951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.888981 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.888989 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.889002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.889010 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:59Z","lastTransitionTime":"2025-12-01T09:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.991500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.991539 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.991549 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.991565 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.991577 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:15:59Z","lastTransitionTime":"2025-12-01T09:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.993775 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.993796 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb"
Dec 01 09:15:59 crc kubenswrapper[4763]: E1201 09:15:59.993881 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 01 09:15:59 crc kubenswrapper[4763]: I1201 09:15:59.993791 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:15:59 crc kubenswrapper[4763]: E1201 09:15:59.993980 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795"
Dec 01 09:15:59 crc kubenswrapper[4763]: E1201 09:15:59.994059 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.093152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.093202 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.093213 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.093224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.093232 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.195311 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.195351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.195364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.195379 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.195390 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.297930 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.297967 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.297975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.297992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.298006 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.400269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.400323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.400341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.400354 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.400362 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
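The "Node became not ready" block recurs roughly every 100 ms (.267, .370, .472, ...). A small sketch for measuring that cadence from the klog prefix (I1201 09:15:59.267263) in lines like the ones above; the regex and the assumed year 2025 are inferred from this journal's format, not an official klog parser:

```python
import re
from datetime import datetime

# klog header: severity letter, MMDD, then HH:MM:SS.microseconds.
# The header carries no year; 2025 is assumed from the journal dates.
KLOG = re.compile(r"[IWEF](\d{2})(\d{2}) (\d{2}):(\d{2}):(\d{2})\.(\d{6})")

def klog_time(line: str, year: int = 2025):
    m = KLOG.search(line)
    if not m:
        return None
    month, day, h, minute, s, us = (int(g) for g in m.groups())
    return datetime(year, month, day, h, minute, s, us)

def cadence(lines, marker: str = "Node became not ready"):
    """Yield the gap between consecutive records containing `marker`."""
    prev = None
    for line in lines:
        if marker not in line:
            continue
        t = klog_time(line)
        if prev is not None and t is not None:
            yield t - prev
        prev = t

# Feeding the records above yields gaps of ~0.103 s, ~0.102 s, ...
```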
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.500983 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/2.log"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.502276 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.502338 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.502363 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.502393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.502417 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.504057 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3"}
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.504352 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27"
Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.514748 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.531383 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.544558 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.554738 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.576134 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.601618 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.605241 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.605282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.605297 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.605313 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.605327 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.616943 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.628375 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:49Z\\\",\\\"message\\\":\\\"2025-12-01T09:15:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d\\\\n2025-12-01T09:15:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d to /host/opt/cni/bin/\\\\n2025-12-01T09:15:04Z [verbose] multus-daemon started\\\\n2025-12-01T09:15:04Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:15:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.639756 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"450c19db-6088-438e-a2ec-a657e1c918f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.652134 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.663147 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.673729 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325
7453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.684039 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.697553 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.707190 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.707341 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.707403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.707499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.707570 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.716282 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.737379 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.747627 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:00Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.811791 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.811845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.811857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.811873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.811884 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.914516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.914556 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.914566 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.914581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.914591 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:00Z","lastTransitionTime":"2025-12-01T09:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:00 crc kubenswrapper[4763]: I1201 09:16:00.993629 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:00 crc kubenswrapper[4763]: E1201 09:16:00.993783 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.016788 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.016827 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.016837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.016852 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.016868 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.119938 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.120015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.120039 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.120072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.120095 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.229743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.229796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.229810 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.229831 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.229843 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.332197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.332269 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.332290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.332323 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.332347 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
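
The recurring NodeNotReady condition above is mechanical: the container runtime reports NetworkReady=false until a CNI network configuration exists under /etc/kubernetes/cni/net.d/, and the kubelet relays that as KubeletNotReady on every sync. A toy version of the config-directory scan — the file patterns are assumptions for illustration, not CRI-O's exact logic:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether any plausible CNI config file exists in dir.
func hasCNIConfig(dir string) bool {
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		if matches, _ := filepath.Glob(filepath.Join(dir, pat)); len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	const dir = "/etc/kubernetes/cni/net.d"
	if !hasCNIConfig(dir) {
		fmt.Printf("no CNI configuration file in %s/. Has your network provider started?\n", dir)
		os.Exit(1)
	}
}
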
Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.435877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.435941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.435960 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.435985 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.436006 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.510370 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/3.log" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.511553 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/2.log" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.517019 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3" exitCode=1 Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.517073 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.517121 4763 scope.go:117] "RemoveContainer" containerID="360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.518214 4763 scope.go:117] "RemoveContainer" containerID="de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3" Dec 01 09:16:01 crc kubenswrapper[4763]: E1201 09:16:01.518449 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.539710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.539805 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.539885 4763 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.539914 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.539940 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.549508 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:49Z\\\",\\\"message\\\":\\\"2025-12-01T09:15:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d\\\\n2025-12-01T09:15:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d to /host/opt/cni/bin/\\\\n2025-12-01T09:15:04Z [verbose] multus-daemon started\\\\n2025-12-01T09:15:04Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:15:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.565999 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.578956 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.592768 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.607766 4763 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.619645 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.633312 4763 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"450c19db-6088-438e-a2ec-a657e1c918f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.644069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.644102 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.644113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.644129 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.644140 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.654062 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.677576 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.708285 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.722281 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.737344 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.746472 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.746517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.746529 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.746548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.746562 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.750631 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.765645 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc 
kubenswrapper[4763]: I1201 09:16:01.787262 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a456125
26eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://360b45bf8d78ee478dd6cee65199dda835f58d0ce3eda990edfff2b36516eef3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:31Z\\\",\\\"message\\\":\\\"1ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1201 09:15:30.870832 6303 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver/check-endpoints]} name:Service_openshift-apiserver/check-endpoints_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.139:17698:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8efa4d1a-72f5-4dfa-9bc2-9d93ef11ecf2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1201 09:15:30.870888 6303 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:16:00Z\\\",\\\"message\\\":\\\"ing *factory.egressNode crc took: 1.321025ms\\\\nI1201 09:16:00.221386 6656 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 09:16:00.221411 6656 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 09:16:00.221410 6656 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:16:00.221422 6656 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:16:00.221441 6656 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:16:00.221481 6656 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 09:16:00.221489 6656 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:16:00.221513 6656 factory.go:656] Stopping watch factory\\\\nI1201 09:16:00.221526 6656 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:16:00.221532 6656 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:16:00.221537 6656 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:16:00.221597 6656 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 09:16:00.221653 6656 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 09:16:00.221679 6656 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:16:00.221697 6656 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 09:16:00.221746 6656 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.798840 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.809867 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc1
8fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:01Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.848873 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.848916 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.848926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.848945 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.848956 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.952130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.952184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.952197 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.952215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.952227 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:01Z","lastTransitionTime":"2025-12-01T09:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.992994 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.993061 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:01 crc kubenswrapper[4763]: I1201 09:16:01.993163 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:01 crc kubenswrapper[4763]: E1201 09:16:01.993249 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:01 crc kubenswrapper[4763]: E1201 09:16:01.993325 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:01 crc kubenswrapper[4763]: E1201 09:16:01.993542 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.054555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.054594 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.054607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.054628 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.054643 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.158560 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.158589 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.158597 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.158610 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.158618 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.260884 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.260913 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.260920 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.260933 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.260941 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.363124 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.363159 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.363170 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.363186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.363195 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.464961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.465007 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.465017 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.465030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.465039 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.521879 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/3.log" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.525515 4763 scope.go:117] "RemoveContainer" containerID="de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3" Dec 01 09:16:02 crc kubenswrapper[4763]: E1201 09:16:02.525836 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.544325 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.561808 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.566974 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.567019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.567031 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.567048 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.567059 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.580551 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.595327 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.613593 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.636403 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:16:00Z\\\",\\\"message\\\":\\\"ing *factory.egressNode crc took: 1.321025ms\\\\nI1201 09:16:00.221386 6656 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 09:16:00.221411 6656 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 09:16:00.221410 6656 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:16:00.221422 6656 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:16:00.221441 6656 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:16:00.221481 6656 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 09:16:00.221489 6656 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:16:00.221513 6656 factory.go:656] Stopping watch factory\\\\nI1201 09:16:00.221526 6656 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:16:00.221532 6656 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:16:00.221537 6656 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:16:00.221597 6656 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 09:16:00.221653 6656 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 09:16:00.221679 6656 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:16:00.221697 6656 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 09:16:00.221746 6656 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.648957 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.666208 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.669893 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.669948 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.669964 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.669983 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.669997 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.683425 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.694264 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.708566 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.724328 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:49Z\\\",\\\"message\\\":\\\"2025-12-01T09:15:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d\\\\n2025-12-01T09:15:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d to /host/opt/cni/bin/\\\\n2025-12-01T09:15:04Z [verbose] multus-daemon started\\\\n2025-12-01T09:15:04Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:15:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.736031 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 
09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.755787 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.773600 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.773642 4763 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.773655 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.773675 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.773691 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.774788 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"450c19db-6088-438e-a2ec-a657e1c918f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.786654 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.798045 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:02Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.877500 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.877552 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.877563 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.877581 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.877593 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.980623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.980697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.980714 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.980745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.980763 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:02Z","lastTransitionTime":"2025-12-01T09:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:02 crc kubenswrapper[4763]: I1201 09:16:02.994416 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:02 crc kubenswrapper[4763]: E1201 09:16:02.994751 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.010137 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f95ef452-7057-4afb-a8ca-1c505b953c2e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12300cde7fe0ec1bdc18e456c2b994fe09f5365d69a6fbbcf20c1722b1dbc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bnpt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l5kgb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.025223 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37dc3a40-ed8f-41fa-831c-fa08525f233c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://739af637dc1fecb27f4a640fb807e956beeca0ad318ccfc5ca2693be8d5b319e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f43285989fdb42d9f6eefb67a06e0a93e2e5226da47efbb02cc4353b02d90c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kzvgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:14Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-j97pk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.040413 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db50acd1-5694-49bc-9027-e96f7612e795\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kn89v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rtkzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.053163 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"450c19db-6088-438e-a2ec-a657e1c918f0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80e365955682b87fee8c04fccd4b61c834516977616fb5de68ae84b29ce97bc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61e3a08cd25ac19d38e2938d72bd8f88a00a7a6eb82d3a4b31690e991b43723f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ae933b5459010c2a082a1f2d4e5a0a056ad25ea7ffc3ba83a7f36fa5141a4ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83e3e0edbd8c0906dfdb4c4c7ae07d5938259090e5d87a589ba571aa1c95d97f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.070193 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.083378 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.083478 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.083494 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.083515 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.083528 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.086379 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3e29aa4a88575ca4940a1bf47214797e6afc53f9aa79e79ad5d33e86057dd42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.111586 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.126012 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.141520 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b4f931e7d3f9178e3962c6bc3e89008f41443fde9db1f1b35ba0b37def5b151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af3968e6633fee5046582e5fd0792e1ad9c2c4778a223bf0a17ce11c7ac50f6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.152263 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-tjks4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8938414d-9ef1-478b-9633-e43890dd4540\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5c78a5dbad1b1a053f93654f4622fe464340802c906bd20e8b7326f00caa89d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptlx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-tjks4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.178748 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47f7352d-5a70-4ded-93bf-875ac4531bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a35631bcff69cdcf338e6bb299d7e61d9c3ca54874256c88288631b154efac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bee11a7cb7b05420c0d5746f0d3fb4b99f3620b3d3782ebd567a752132079d21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d70d0f02b89754081ae0793bdadbb092acacac77ca7566679a3cfe034e5ac617\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc1601e95d1a0291a85848d0fcb432fa011b34d154bdbc0498ee00a231ad5601\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b7875117a42be4292d24cfa6b8050a232399c668efb3eb7f1f3808f78e1a895\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cca3a8344ba981cb39500dc8b6d1f4cb9950cc561bac77c5e15e39b5ed3065e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbfca64ee8345f5427eeac5066562211f7692bc95e28d95a54fce424d9f02cf1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lmg68\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kcjjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.186087 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.186152 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:03 crc 
kubenswrapper[4763]: I1201 09:16:03.186175 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.186205 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.186227 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.211283 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e57a17bb-0609-4f45-ac9a-af60af65cdd9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://de6e64bfd272382d712bffa0f8236236bb869407
8373592c6bc4417644ee9ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:16:00Z\\\",\\\"message\\\":\\\"ing *factory.egressNode crc took: 1.321025ms\\\\nI1201 09:16:00.221386 6656 factory.go:1336] Added *v1.Node event handler 7\\\\nI1201 09:16:00.221411 6656 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1201 09:16:00.221410 6656 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1201 09:16:00.221422 6656 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1201 09:16:00.221441 6656 handler.go:208] Removed *v1.Node event handler 2\\\\nI1201 09:16:00.221481 6656 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1201 09:16:00.221489 6656 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1201 09:16:00.221513 6656 factory.go:656] Stopping watch factory\\\\nI1201 09:16:00.221526 6656 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1201 09:16:00.221532 6656 handler.go:208] Removed *v1.Node event handler 7\\\\nI1201 09:16:00.221537 6656 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1201 09:16:00.221597 6656 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1201 09:16:00.221653 6656 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1201 09:16:00.221679 6656 ovnkube.go:599] Stopped ovnkube\\\\nI1201 09:16:00.221697 6656 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1201 09:16:00.221746 6656 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:15:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wzr9g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rpg27\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.229790 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zrb77" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb8ad9e2-a93d-41cb-8014-296ebf0e7333\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d6e4bc34c3e5814cf03b34cd632eae118ffe0f818547cbb100e26348a4558e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tq99b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:05Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zrb77\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.243413 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27df505b-02dc-4b2d-a9ae-68595b36f69f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d54dc7f39037408a414b5690f838966e43b3c9abed7793b748d176ac367de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e26060abc9acf1d9c2caece3670ae249065aad1f584a77c8d01b4f8dc75459c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15efa2f28a027ab90cf154a6276fb32aceac845c02da1374d87e8677331a1f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.264134 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2227a154479f6b896f1ea8af87e079d7cb55a5c14fac8e66562ac61eea4a445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.279870 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-01T09:15:00Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1201 09:14:55.334900 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1201 09:14:55.336127 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2339480483/tls.crt::/tmp/serving-cert-2339480483/tls.key\\\\\\\"\\\\nI1201 09:15:00.757559 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1201 09:15:00.766281 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1201 09:15:00.766309 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1201 09:15:00.766640 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1201 09:15:00.766653 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1201 09:15:00.787345 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1201 09:15:00.787390 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787396 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1201 09:15:00.787406 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1201 09:15:00.787409 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1201 09:15:00.787413 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1201 09:15:00.787417 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1201 09:15:00.787408 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1201 09:15:00.790046 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:14:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:14:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:14:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:14:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.288713 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.288779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.288804 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.288833 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.288856 4763 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.296036 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-fr552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"192e1ecd-fa1f-4227-a40c-4f7773682880\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-01T09:15:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-01T09:15:49Z\\\",\\\"message\\\":\\\"2025-12-01T09:15:03+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d\\\\n2025-12-01T09:15:03+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_18859b50-c684-49c8-99ac-41087be5527d to /host/opt/cni/bin/\\\\n2025-12-01T09:15:04Z [verbose] multus-daemon started\\\\n2025-12-01T09:15:04Z [verbose] Readiness Indicator file check\\\\n2025-12-01T09:15:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-01T09:15:02Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:15:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xwhc7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-01T09:15:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-fr552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:03Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.391158 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.391199 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.391209 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.391224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.391235 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.494098 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.494146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.494156 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.494172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.494183 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.596243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.596525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.596602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.596689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.596775 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.699403 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.699481 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.699495 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.699511 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.699520 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.802199 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.802575 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.802588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.802606 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.802617 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.906014 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.906312 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.906409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.906559 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.906654 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:03Z","lastTransitionTime":"2025-12-01T09:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.993480 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.993547 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:03 crc kubenswrapper[4763]: I1201 09:16:03.993573 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:03 crc kubenswrapper[4763]: E1201 09:16:03.993635 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:03 crc kubenswrapper[4763]: E1201 09:16:03.993808 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:03 crc kubenswrapper[4763]: E1201 09:16:03.994576 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.009548 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.009585 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.009598 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.009614 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.009626 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.112681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.112740 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.112751 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.112768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.112778 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.214970 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.215018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.215035 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.215057 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.215074 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.319028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.319105 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.319118 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.319142 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.319174 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.421072 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.421106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.421116 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.421130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.421141 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.524182 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.524251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.524273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.524301 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.524320 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.628319 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.628364 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.628380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.628401 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.628417 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.731194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.731241 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.731257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.731281 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.731297 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.834844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.834919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.834952 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.835029 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.835071 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.883260 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.883406 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.883382238 +0000 UTC m=+146.152031006 (durationBeforeRetry 1m4s). 
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.883563 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.883713 4763 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.883775 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.883759977 +0000 UTC m=+146.152408735 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.938085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.938145 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.938162 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.938184 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.938200 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:04Z","lastTransitionTime":"2025-12-01T09:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.984415 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.984451 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.984496 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984608 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984626 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984638 4763 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984640 4763 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984697 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.984678055 +0000 UTC m=+146.253326843 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984736 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.984711956 +0000 UTC m=+146.253360734 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984649 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984767 4763 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984782 4763 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.984815 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.984806669 +0000 UTC m=+146.253455457 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 01 09:16:04 crc kubenswrapper[4763]: I1201 09:16:04.993713 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 01 09:16:04 crc kubenswrapper[4763]: E1201 09:16:04.993842 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.004023 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.040443 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.040520 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.040533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.040550 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.040561 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.143216 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.143253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.143261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.143277 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.143287 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.247071 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.247119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.247131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.247146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.247157 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.350293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.350345 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.350357 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.350386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.350400 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.459815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.459889 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.459902 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.459926 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.459942 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.563186 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.563253 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.563271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.563294 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.563311 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.667042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.667084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.667092 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.667106 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.667117 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.765612 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.765723 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.765742 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.765768 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.765785 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.782066 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.788047 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.788096 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.788108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.788126 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.788139 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.803581 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.817388 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.817442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
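Each retry in this burst fails with the identical cause shown in full in the first error above: the node-identity webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, long before the node clock's 2025-12-01. Below is a sketch of how the validity window could be confirmed from the node using only the Go standard library; InsecureSkipVerify is deliberate here, since verification failing is exactly the symptom under investigation, and we only want to read the certificate.

// certcheck.go - inspect the NotBefore/NotAfter window of the webhook
// serving certificate named in the log.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // webhook endpoint from the log
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Printf("dial %s: %v\n", addr, err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		// matches the log: "current time ... is after 2025-08-24T17:21:41Z"
		fmt.Println("certificate has EXPIRED")
	}
}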
event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.817493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.817518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.817536 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.837699 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.842605 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.842666 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
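The setters.go entries record the full Ready condition as inline JSON, which makes them easy to post-process when triaging a log like this. The following sketch decodes one such payload; the struct mirrors only the fields visible in these lines (the real type is corev1.NodeCondition from k8s.io/api), so treat it as illustrative.

// condition.go - decode the condition={...} payload that setters.go logs.
package main

import (
	"encoding/json"
	"fmt"
)

type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// payload copied from one of the "Node became not ready" entries above
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}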
event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.842678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.842701 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.842713 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.857298 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.862561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.862622 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
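The node-status patch is retried in a tight loop (attempts at 09:16:05.782066 through 09:16:05.879075 in this burst), whereas the failed volume mounts at the top of this excerpt back off exponentially: the "durationBeforeRetry 1m4s" recorded there is 500ms doubled seven times. The sketch below reproduces that schedule; the initial delay, growth factor, and ceiling are assumptions drawn from kubelet's usual exponential-backoff defaults, not something this log states directly.

// backoff.go - reproduce the durationBeforeRetry progression implied by
// the "durationBeforeRetry 1m4s" volume errors earlier in this excerpt.
package main

import (
	"fmt"
	"time"
)

func main() {
	d := 500 * time.Millisecond               // assumed initial delay
	maxDelay := 2*time.Minute + 2*time.Second // assumed ceiling

	for failure := 1; failure <= 10; failure++ {
		fmt.Printf("failure %2d -> wait %s\n", failure, d)
		d *= 2 // assumed growth factor
		if d > maxDelay {
			d = maxDelay
		}
	}
	// failure 8 prints "wait 1m4s", matching the log's durationBeforeRetry
}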
event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.862635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.862658 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.862672 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.879075 4763 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-01T09:16:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c0bd43ec-2730-494c-91aa-feba284cbe79\\\",\\\"systemUUID\\\":\\\"0f5eae23-6db1-423b-9ba3-36ae34520ea2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-01T09:16:05Z is after 2025-08-24T17:21:41Z" Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.879378 4763 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.881662 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.881766 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.881837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.881906 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.881967 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.984568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.984620 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.984632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.984646 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.984656 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:05Z","lastTransitionTime":"2025-12-01T09:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.993095 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.993094 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.993236 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.993261 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:05 crc kubenswrapper[4763]: I1201 09:16:05.993120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:05 crc kubenswrapper[4763]: E1201 09:16:05.993378 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.087185 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.087232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.087243 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.087261 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.087272 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.189485 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.189517 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.189527 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.189540 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.189551 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.292127 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.292193 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.292205 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.292223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.292233 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.394860 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.394908 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.394918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.394934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.394946 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.498386 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.498420 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.498429 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.498442 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.498487 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.600881 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.600919 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.600928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.600943 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.600952 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.705892 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.705934 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.705946 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.705984 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.705995 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.808672 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.808716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.808727 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.808745 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.808758 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.911649 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.911710 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.911725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.911747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.911766 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:06Z","lastTransitionTime":"2025-12-01T09:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:06 crc kubenswrapper[4763]: I1201 09:16:06.993839 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:06 crc kubenswrapper[4763]: E1201 09:16:06.994306 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.014954 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.015026 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.015046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.016750 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.016791 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.022565 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.121426 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.121510 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.121522 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.121542 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.121555 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.224743 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.224787 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.224800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.224818 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.224830 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.328282 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.328353 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.328366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.328383 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.328394 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.431747 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.431786 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.431799 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.431822 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.431834 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.534707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.534809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.534825 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.534849 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.534866 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.638139 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.638214 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.638232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.638256 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.638269 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.740883 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.740918 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.740928 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.740944 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.740957 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.843307 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.843350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.843360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.843374 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.843384 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.946639 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.946698 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.946719 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.946739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.946751 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:07Z","lastTransitionTime":"2025-12-01T09:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.993948 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.993961 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:07 crc kubenswrapper[4763]: I1201 09:16:07.993961 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:07 crc kubenswrapper[4763]: E1201 09:16:07.994437 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:07 crc kubenswrapper[4763]: E1201 09:16:07.994883 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:07 crc kubenswrapper[4763]: E1201 09:16:07.994708 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.049251 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.049588 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.049687 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.049780 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.049865 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.152215 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.152257 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.152268 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.152283 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.152297 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.254315 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.254681 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.254760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.254840 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.254906 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.357940 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.358008 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.358018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.358037 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.358051 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.461664 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.461727 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.461744 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.461778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.461797 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.564910 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.564961 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.564975 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.564994 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.565007 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.667988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.668050 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.668063 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.668084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.668100 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.771366 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.771439 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.771486 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.771512 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.771526 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.874321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.874422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.874440 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.874490 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.874514 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.978085 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.978133 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.978146 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.978196 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.978210 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:08Z","lastTransitionTime":"2025-12-01T09:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:08 crc kubenswrapper[4763]: I1201 09:16:08.994039 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:08 crc kubenswrapper[4763]: E1201 09:16:08.994224 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.081224 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.081271 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.081287 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.081309 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.081324 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.184299 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.184350 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.184368 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.184389 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.184406 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.286262 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.286304 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.286314 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.286331 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.286343 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.388558 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.388599 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.388614 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.388630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.388641 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.491537 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.491592 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.491609 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.491632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.491648 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.594811 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.595163 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.595255 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.595291 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.595315 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.698630 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.698778 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.698817 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.698846 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.698867 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.802321 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.802749 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.802910 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.803120 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.803310 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.906979 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.907045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.907069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.907100 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.907123 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:09Z","lastTransitionTime":"2025-12-01T09:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.993148 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:09 crc kubenswrapper[4763]: E1201 09:16:09.993387 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.993170 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:09 crc kubenswrapper[4763]: E1201 09:16:09.993579 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:09 crc kubenswrapper[4763]: I1201 09:16:09.993148 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:09 crc kubenswrapper[4763]: E1201 09:16:09.993706 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.011011 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.011097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.011115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.011183 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.011200 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.114622 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.114709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.114736 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.114764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.114786 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.218631 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.218667 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.218676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.218693 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.218703 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.322232 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.322322 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.322767 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.322848 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.322867 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.425973 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.426045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.426069 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.426097 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.426120 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.528836 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.528915 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.528952 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.528991 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.529015 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.631518 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.631568 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.631583 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.631604 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.631618 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.734646 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.734684 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.734694 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.734706 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.734714 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.837018 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.837084 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.837092 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.837108 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.837117 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.939344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.939390 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.939406 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.939427 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.939441 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:10Z","lastTransitionTime":"2025-12-01T09:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:10 crc kubenswrapper[4763]: I1201 09:16:10.993863 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:10 crc kubenswrapper[4763]: E1201 09:16:10.994026 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.041955 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.042002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.042012 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.042028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.042039 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.144425 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.144480 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.144516 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.144533 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.144543 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.248835 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.248867 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.248877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.248891 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.248900 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.351683 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.351741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.351752 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.351772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.351786 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.454654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.454686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.454697 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.454712 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.454724 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.556640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.556703 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.556715 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.556738 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.556752 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.659344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.659483 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.659503 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.659525 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.659544 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.764351 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.764409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.764424 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.764471 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.764485 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.866929 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.866969 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.866980 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.866996 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.867007 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.969329 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.969379 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.969390 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.969407 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.969419 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:11Z","lastTransitionTime":"2025-12-01T09:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.993190 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.993252 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:11 crc kubenswrapper[4763]: E1201 09:16:11.993554 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:11 crc kubenswrapper[4763]: E1201 09:16:11.993627 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:11 crc kubenswrapper[4763]: I1201 09:16:11.993589 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:11 crc kubenswrapper[4763]: E1201 09:16:11.993913 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.072951 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.073019 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.073030 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.073056 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.073075 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.175992 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.176061 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.176083 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.176115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.176138 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.279028 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.279079 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.279098 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.279119 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.279133 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.381689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.381755 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.381780 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.381808 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.381829 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.484493 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.484538 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.484555 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.484578 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.484595 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.587239 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.587285 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.587302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.587324 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.587377 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.690711 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.690771 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.690789 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.690812 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.690829 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.793564 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.793623 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.793640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.793667 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.793684 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.895815 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.895874 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.895890 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.895911 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.895926 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.993396 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:12 crc kubenswrapper[4763]: E1201 09:16:12.993574 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.997779 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.997807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.997820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.997834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:12 crc kubenswrapper[4763]: I1201 09:16:12.997845 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:12Z","lastTransitionTime":"2025-12-01T09:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.076450 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.076430962 podStartE2EDuration="1m8.076430962s" podCreationTimestamp="2025-12-01 09:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.076226626 +0000 UTC m=+90.344875394" watchObservedRunningTime="2025-12-01 09:16:13.076430962 +0000 UTC m=+90.345079730" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.099612 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.099988 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.100002 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.100015 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.100024 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.124106 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.124087148 podStartE2EDuration="6.124087148s" podCreationTimestamp="2025-12-01 09:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.106645581 +0000 UTC m=+90.375294349" watchObservedRunningTime="2025-12-01 09:16:13.124087148 +0000 UTC m=+90.392735916" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.136882 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tjks4" podStartSLOduration=72.136858221 podStartE2EDuration="1m12.136858221s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.13682141 +0000 UTC m=+90.405470178" watchObservedRunningTime="2025-12-01 09:16:13.136858221 +0000 UTC m=+90.405506989" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.156375 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kcjjj" podStartSLOduration=72.156358981 podStartE2EDuration="1m12.156358981s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.1559657 +0000 UTC m=+90.424614468" watchObservedRunningTime="2025-12-01 09:16:13.156358981 +0000 UTC m=+90.425007749" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.188352 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zrb77" podStartSLOduration=72.188336156 podStartE2EDuration="1m12.188336156s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.188234753 +0000 UTC m=+90.456883521" watchObservedRunningTime="2025-12-01 09:16:13.188336156 +0000 UTC m=+90.456984924" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.202150 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.202198 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.202210 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.202228 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.202240 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.207659 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.207645141 podStartE2EDuration="1m12.207645141s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.20760315 +0000 UTC m=+90.476251918" watchObservedRunningTime="2025-12-01 09:16:13.207645141 +0000 UTC m=+90.476293909" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.224966 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fr552" podStartSLOduration=72.224948584 podStartE2EDuration="1m12.224948584s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.223888136 +0000 UTC m=+90.492536904" watchObservedRunningTime="2025-12-01 09:16:13.224948584 +0000 UTC m=+90.493597362" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.254694 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.254675131 podStartE2EDuration="35.254675131s" podCreationTimestamp="2025-12-01 09:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.243528579 +0000 UTC m=+90.512177357" watchObservedRunningTime="2025-12-01 09:16:13.254675131 +0000 UTC m=+90.523323899" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.254803 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=9.254798064 podStartE2EDuration="9.254798064s" podCreationTimestamp="2025-12-01 09:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.254215689 +0000 UTC m=+90.522864457" watchObservedRunningTime="2025-12-01 09:16:13.254798064 +0000 UTC m=+90.523446832" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.293610 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-j97pk" podStartSLOduration=71.293593108 podStartE2EDuration="1m11.293593108s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.293214878 +0000 UTC m=+90.561863656" watchObservedRunningTime="2025-12-01 09:16:13.293593108 +0000 UTC m=+90.562241876" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.294124 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podStartSLOduration=72.294116822 podStartE2EDuration="1m12.294116822s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:13.282120808 +0000 UTC m=+90.550769576" watchObservedRunningTime="2025-12-01 09:16:13.294116822 +0000 UTC m=+90.562765590" Dec 01 09:16:13 crc 
kubenswrapper[4763]: I1201 09:16:13.305762 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.305797 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.305807 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.305820 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.305831 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.407716 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.407753 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.407764 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.407800 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.407812 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.509346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.509385 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.509395 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.509409 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.509421 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.611273 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.611327 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.611337 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.611355 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.611367 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.713635 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.713678 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.713690 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.713708 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.713721 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.816325 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.816393 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.816416 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.816446 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.816505 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.919172 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.919209 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.919218 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.919410 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.919420 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:13Z","lastTransitionTime":"2025-12-01T09:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.993886 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.993964 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:13 crc kubenswrapper[4763]: E1201 09:16:13.994006 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:13 crc kubenswrapper[4763]: I1201 09:16:13.994056 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:13 crc kubenswrapper[4763]: E1201 09:16:13.994091 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:13 crc kubenswrapper[4763]: E1201 09:16:13.994181 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795"
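The kubelet here is re-recording the node conditions roughly every 100 ms (09:16:13.202, .305, .407, .509, ...): each pass notes the healthy conditions (sufficient memory, no disk pressure, sufficient PIDs) but leaves Ready=False because the container runtime (CRI-O on this node) reports NetworkReady=false. The tight cadence is consistent with the kubelet's fast node-status update path, which polls until the node first reports Ready. The root cause in every message is the same: no CNI network configuration exists under /etc/kubernetes/cni/net.d/, which the cluster network stack (OVN-Kubernetes with Multus) has not yet written because ovnkube-node is still crash-looping, as seen further down. A minimal sketch of that kind of readiness check, assuming only the conventional CNI config file extensions; this is illustrative, not kubelet or CRI-O source:

```go
// Sketch: how a runtime can decide that no CNI network config is present,
// which is what surfaces as NetworkReady=false in the conditions above.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether confDir contains at least one CNI network
// configuration file (.conf, .conflist, or .json by convention).
func hasCNIConfig(confDir string) (bool, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		// The situation in this log: the network plugin has not yet
		// written its config, so the node stays NotReady.
		fmt.Println("network not ready: no CNI configuration file found")
		return
	}
	fmt.Println("CNI configuration present")
}
```

Until a config file appears in that directory, every pod that needs cluster networking (the network-check-*, network-metrics, and networking-console pods in the entries above) is skipped with the same "network is not ready" error. Host-network pods are unaffected, which is why etcd, kube-apiserver, and the other static pods recorded startup durations normally at the top of this window.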
Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.021863 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.021909 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.021923 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.021941 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.021951 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.124686 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.124731 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.124741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.124757 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.124768 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.227602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.227640 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.227654 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.227673 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.227687 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.330302 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.330346 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.330358 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.330375 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.330387 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.432663 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.432709 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.432721 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.432739 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.432751 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.535004 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.535260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.535320 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.535380 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.535434 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.637776 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.637824 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.637837 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.637855 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.637867 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.739772 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.739801 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.739809 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.739821 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.739830 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.843043 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.843099 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.843115 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.843137 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.843155 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.946042 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.946073 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.946082 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.946094 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.946104 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:14Z","lastTransitionTime":"2025-12-01T09:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.993389 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:14 crc kubenswrapper[4763]: E1201 09:16:14.993588 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:14 crc kubenswrapper[4763]: I1201 09:16:14.994843 4763 scope.go:117] "RemoveContainer" containerID="de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3" Dec 01 09:16:14 crc kubenswrapper[4763]: E1201 09:16:14.995003 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.048141 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.048279 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.048293 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.048311 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.048324 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.151161 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.151223 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.151235 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.151260 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.151275 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
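The "back-off 40s" in the CrashLoopBackOff entry above is the kubelet's per-container restart backoff: the delay doubles with each consecutive crash and is capped (10s base and a 5m cap in upstream kubelet), so ovnkube-controller is on roughly its third consecutive failure here. The RemoveContainer entry just before it is the kubelet cleaning up the previous dead instance of the container before the next attempt. A sketch of the doubling schedule, with constants mirroring upstream defaults; this is illustrative, not kubelet source:

```go
// Sketch: CrashLoopBackOff delay growth. With a 10s base and 5m cap,
// consecutive restarts wait 10s, 20s, 40s, 1m20s, 2m40s, then 5m.
package main

import (
	"fmt"
	"time"
)

func crashLoopDelay(restarts int, base, max time.Duration) time.Duration {
	d := base
	for i := 0; i < restarts; i++ {
		d *= 2
		if d > max {
			return max
		}
	}
	return d
}

func main() {
	for r := 0; r <= 5; r++ {
		fmt.Printf("restart %d: wait %v\n", r, crashLoopDelay(r, 10*time.Second, 5*time.Minute))
	}
}
```

This backoff matters for the whole log: the node cannot leave NotReady until ovnkube-controller stays up long enough to write the CNI config, and each crash pushes the next attempt further out.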
Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.254777 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.254844 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.254857 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.254877 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.254890 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.357045 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.357101 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.357113 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.357130 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.357143 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.459561 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.459602 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.459613 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.459632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.459645 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.561796 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.561834 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.561845 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.561861 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.561874 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.664003 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.664292 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.664360 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.664422 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.664499 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.766499 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.766792 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.767046 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.767247 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.767392 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.870670 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.870725 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.870741 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.870760 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.870772 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.973290 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.973331 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.973344 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.973362 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.973374 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:15Z","lastTransitionTime":"2025-12-01T09:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.993050 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.993077 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:15 crc kubenswrapper[4763]: E1201 09:16:15.993178 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:15 crc kubenswrapper[4763]: E1201 09:16:15.993306 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:15 crc kubenswrapper[4763]: I1201 09:16:15.993355 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:15 crc kubenswrapper[4763]: E1201 09:16:15.993417 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.078625 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.078676 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.078689 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.078707 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.078889 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:16Z","lastTransitionTime":"2025-12-01T09:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.182554 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.182598 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.182607 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.182632 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.182644 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:16Z","lastTransitionTime":"2025-12-01T09:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.209131 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.209167 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.209177 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.209194 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.209206 4763 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-01T09:16:16Z","lastTransitionTime":"2025-12-01T09:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.248446 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs"] Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.248869 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.251948 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.252207 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.252524 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.252721 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.307028 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6745591-cc33-40ce-954f-436bc9791eda-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.307074 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6745591-cc33-40ce-954f-436bc9791eda-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.307131 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6745591-cc33-40ce-954f-436bc9791eda-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.307153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a6745591-cc33-40ce-954f-436bc9791eda-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.307178 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a6745591-cc33-40ce-954f-436bc9791eda-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.408507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6745591-cc33-40ce-954f-436bc9791eda-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc 
kubenswrapper[4763]: I1201 09:16:16.408554 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a6745591-cc33-40ce-954f-436bc9791eda-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.408587 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a6745591-cc33-40ce-954f-436bc9791eda-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.408627 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a6745591-cc33-40ce-954f-436bc9791eda-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.408634 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6745591-cc33-40ce-954f-436bc9791eda-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.408687 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6745591-cc33-40ce-954f-436bc9791eda-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.408721 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a6745591-cc33-40ce-954f-436bc9791eda-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.409764 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6745591-cc33-40ce-954f-436bc9791eda-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.419968 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6745591-cc33-40ce-954f-436bc9791eda-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.424139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6745591-cc33-40ce-954f-436bc9791eda-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lk4cs\" (UID: \"a6745591-cc33-40ce-954f-436bc9791eda\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs"
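The reconciler_common.go and operation_generator.go pairs above are the kubelet's volume manager driving the new cluster-version-operator pod's volumes from desired state to actual state: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded, for each of etc-cvo-updatepayloads, etc-ssl-certs, service-ca, serving-cert, and kube-api-access. The host-path volumes succeed within microseconds; the secret- and projected-backed ones take a few milliseconds longer because their contents are materialized from API objects. A toy sketch of the desired-vs-actual reconcile pattern behind these lines, with types and names invented for illustration rather than taken from kubelet's actual API:

```go
// Sketch: the volume manager keeps a desired state (volumes pods need)
// and an actual state (volumes verified and mounted), and repeatedly
// drives the actual state toward the desired one.
package main

import "fmt"

type volume struct{ name, pod string }

func reconcile(desired []volume, actual map[string]bool) {
	for _, v := range desired {
		if !actual[v.name] {
			// kubelet logs "MountVolume started" here and
			// "MountVolume.SetUp succeeded" once the mount completes.
			fmt.Printf("mounting %q for pod %q\n", v.name, v.pod)
			actual[v.name] = true
		}
	}
}

func main() {
	desired := []volume{
		{"service-ca", "cluster-version-operator-5c965bbfc6-lk4cs"},
		{"kube-api-access", "cluster-version-operator-5c965bbfc6-lk4cs"},
	}
	reconcile(desired, map[string]bool{})
}
```

With all five volumes mounted, the pod can be started, which is what the subsequent "No sandbox for pod can be found" and PLEG ContainerStarted events for cluster-version-operator show: the first ContainerStarted is the pod sandbox, the second the operator container itself.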
Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.562608 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" Dec 01 09:16:16 crc kubenswrapper[4763]: I1201 09:16:16.993937 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:16 crc kubenswrapper[4763]: E1201 09:16:16.994084 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:17 crc kubenswrapper[4763]: I1201 09:16:17.571505 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" event={"ID":"a6745591-cc33-40ce-954f-436bc9791eda","Type":"ContainerStarted","Data":"fd6310d244a62558bf4286265e5f83e431af1345b56962589336877ef120ba66"} Dec 01 09:16:17 crc kubenswrapper[4763]: I1201 09:16:17.571547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" event={"ID":"a6745591-cc33-40ce-954f-436bc9791eda","Type":"ContainerStarted","Data":"4a6e28e2c30d9397c14c6aac019c95846451a87b694558addab7b639ce27abe5"} Dec 01 09:16:17 crc kubenswrapper[4763]: I1201 09:16:17.993915 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:17 crc kubenswrapper[4763]: I1201 09:16:17.993915 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:17 crc kubenswrapper[4763]: E1201 09:16:17.994049 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:17 crc kubenswrapper[4763]: E1201 09:16:17.994114 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:17 crc kubenswrapper[4763]: I1201 09:16:17.993915 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:17 crc kubenswrapper[4763]: E1201 09:16:17.994184 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:18 crc kubenswrapper[4763]: I1201 09:16:18.993354 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:18 crc kubenswrapper[4763]: E1201 09:16:18.993517 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:19 crc kubenswrapper[4763]: I1201 09:16:19.993941 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:19 crc kubenswrapper[4763]: E1201 09:16:19.994337 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:19 crc kubenswrapper[4763]: I1201 09:16:19.993983 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:19 crc kubenswrapper[4763]: I1201 09:16:19.993976 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:19 crc kubenswrapper[4763]: E1201 09:16:19.994411 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:19 crc kubenswrapper[4763]: E1201 09:16:19.994621 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:20 crc kubenswrapper[4763]: I1201 09:16:20.144718 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:20 crc kubenswrapper[4763]: E1201 09:16:20.144945 4763 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:16:20 crc kubenswrapper[4763]: E1201 09:16:20.145040 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs podName:db50acd1-5694-49bc-9027-e96f7612e795 nodeName:}" failed. No retries permitted until 2025-12-01 09:17:24.145017479 +0000 UTC m=+161.413666257 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs") pod "network-metrics-daemon-rtkzb" (UID: "db50acd1-5694-49bc-9027-e96f7612e795") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 01 09:16:20 crc kubenswrapper[4763]: I1201 09:16:20.993967 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:20 crc kubenswrapper[4763]: E1201 09:16:20.994359 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:21 crc kubenswrapper[4763]: I1201 09:16:21.993953 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:21 crc kubenswrapper[4763]: I1201 09:16:21.993981 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:21 crc kubenswrapper[4763]: I1201 09:16:21.993964 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:21 crc kubenswrapper[4763]: E1201 09:16:21.994170 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:21 crc kubenswrapper[4763]: E1201 09:16:21.994078 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
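The nestedpendingoperations.go entry above shows the volume layer's own exponential backoff, separate from CrashLoopBackOff: the metrics-certs mount for network-metrics-daemon-rtkzb keeps failing because the metrics-daemon-secret object is not yet registered with the kubelet's secret manager, and the next retry is pushed out by 1m4s. 64s is consistent with a per-operation doubling backoff that started around 500ms several failures earlier (0.5s x 2^7). A sketch of that schedule follows; the initial delay and cap are assumptions based on upstream defaults, not values taken from this log:

```go
// Sketch: per-volume-operation retry backoff. Delay doubles per
// consecutive failure and is capped; failure 8 of the assumed schedule
// lands on 1m4s, matching the durationBeforeRetry above.
package main

import (
	"fmt"
	"time"
)

func nextRetryDelay(failures int, initial, max time.Duration) time.Duration {
	d := initial
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

func main() {
	for f := 1; f <= 8; f++ {
		fmt.Printf("failure %d: retry in %v\n",
			f, nextRetryDelay(f, 500*time.Millisecond, 2*time.Minute+2*time.Second))
	}
}
```

The long retry window costs nothing here: the same pod is being skipped anyway for the unrelated CNI reason, so the mount could not complete a pod start yet in any case.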
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:21 crc kubenswrapper[4763]: E1201 09:16:21.994338 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:22 crc kubenswrapper[4763]: I1201 09:16:22.993626 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:22 crc kubenswrapper[4763]: E1201 09:16:22.994922 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:23 crc kubenswrapper[4763]: I1201 09:16:23.993259 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:23 crc kubenswrapper[4763]: I1201 09:16:23.993283 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:23 crc kubenswrapper[4763]: I1201 09:16:23.993283 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:23 crc kubenswrapper[4763]: E1201 09:16:23.993413 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:23 crc kubenswrapper[4763]: E1201 09:16:23.993545 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:23 crc kubenswrapper[4763]: E1201 09:16:23.993647 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:24 crc kubenswrapper[4763]: I1201 09:16:24.993110 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:24 crc kubenswrapper[4763]: E1201 09:16:24.993249 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:25 crc kubenswrapper[4763]: I1201 09:16:25.994008 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:25 crc kubenswrapper[4763]: I1201 09:16:25.994008 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:25 crc kubenswrapper[4763]: E1201 09:16:25.994156 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:25 crc kubenswrapper[4763]: E1201 09:16:25.994243 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:25 crc kubenswrapper[4763]: I1201 09:16:25.994030 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:25 crc kubenswrapper[4763]: E1201 09:16:25.994336 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:26 crc kubenswrapper[4763]: I1201 09:16:26.993805 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:26 crc kubenswrapper[4763]: E1201 09:16:26.994073 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:27 crc kubenswrapper[4763]: I1201 09:16:27.993972 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:27 crc kubenswrapper[4763]: E1201 09:16:27.994100 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:27 crc kubenswrapper[4763]: I1201 09:16:27.994285 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:27 crc kubenswrapper[4763]: E1201 09:16:27.994491 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:27 crc kubenswrapper[4763]: I1201 09:16:27.994291 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:27 crc kubenswrapper[4763]: E1201 09:16:27.994872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:28 crc kubenswrapper[4763]: I1201 09:16:28.994067 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:28 crc kubenswrapper[4763]: E1201 09:16:28.994262 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:29 crc kubenswrapper[4763]: I1201 09:16:29.993793 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:29 crc kubenswrapper[4763]: I1201 09:16:29.993838 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:29 crc kubenswrapper[4763]: I1201 09:16:29.993856 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:29 crc kubenswrapper[4763]: E1201 09:16:29.993946 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:29 crc kubenswrapper[4763]: E1201 09:16:29.994092 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:29 crc kubenswrapper[4763]: E1201 09:16:29.994539 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:29 crc kubenswrapper[4763]: I1201 09:16:29.994810 4763 scope.go:117] "RemoveContainer" containerID="de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3" Dec 01 09:16:29 crc kubenswrapper[4763]: E1201 09:16:29.994962 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rpg27_openshift-ovn-kubernetes(e57a17bb-0609-4f45-ac9a-af60af65cdd9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" Dec 01 09:16:30 crc kubenswrapper[4763]: I1201 09:16:30.993566 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:30 crc kubenswrapper[4763]: E1201 09:16:30.993749 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:31 crc kubenswrapper[4763]: I1201 09:16:31.993525 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:31 crc kubenswrapper[4763]: E1201 09:16:31.993722 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:31 crc kubenswrapper[4763]: I1201 09:16:31.993837 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:31 crc kubenswrapper[4763]: I1201 09:16:31.993899 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:31 crc kubenswrapper[4763]: E1201 09:16:31.994651 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:31 crc kubenswrapper[4763]: E1201 09:16:31.994950 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:32 crc kubenswrapper[4763]: I1201 09:16:32.993341 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:32 crc kubenswrapper[4763]: E1201 09:16:32.994273 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:33 crc kubenswrapper[4763]: I1201 09:16:33.993164 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:33 crc kubenswrapper[4763]: I1201 09:16:33.993174 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:33 crc kubenswrapper[4763]: E1201 09:16:33.993346 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:33 crc kubenswrapper[4763]: E1201 09:16:33.993593 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:33 crc kubenswrapper[4763]: I1201 09:16:33.994258 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:33 crc kubenswrapper[4763]: E1201 09:16:33.994705 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:34 crc kubenswrapper[4763]: I1201 09:16:34.994079 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:34 crc kubenswrapper[4763]: E1201 09:16:34.994237 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:35 crc kubenswrapper[4763]: I1201 09:16:35.993539 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:35 crc kubenswrapper[4763]: E1201 09:16:35.993672 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:35 crc kubenswrapper[4763]: I1201 09:16:35.993680 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:35 crc kubenswrapper[4763]: E1201 09:16:35.993955 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:35 crc kubenswrapper[4763]: I1201 09:16:35.994655 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:35 crc kubenswrapper[4763]: E1201 09:16:35.994915 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:36 crc kubenswrapper[4763]: I1201 09:16:36.629302 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/1.log" Dec 01 09:16:36 crc kubenswrapper[4763]: I1201 09:16:36.630026 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/0.log" Dec 01 09:16:36 crc kubenswrapper[4763]: I1201 09:16:36.630183 4763 generic.go:334] "Generic (PLEG): container finished" podID="192e1ecd-fa1f-4227-a40c-4f7773682880" containerID="92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705" exitCode=1 Dec 01 09:16:36 crc kubenswrapper[4763]: I1201 09:16:36.630287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr552" event={"ID":"192e1ecd-fa1f-4227-a40c-4f7773682880","Type":"ContainerDied","Data":"92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705"} Dec 01 09:16:36 crc kubenswrapper[4763]: I1201 09:16:36.630448 4763 scope.go:117] "RemoveContainer" containerID="18a88d1096283653fe5f5330f9249082dd26f31c85c96c948b059fdaddbde30d" Dec 01 09:16:36 crc kubenswrapper[4763]: I1201 09:16:36.630888 4763 scope.go:117] "RemoveContainer" containerID="92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705" Dec 01 09:16:36 crc kubenswrapper[4763]: E1201 09:16:36.631086 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-fr552_openshift-multus(192e1ecd-fa1f-4227-a40c-4f7773682880)\"" pod="openshift-multus/multus-fr552" podUID="192e1ecd-fa1f-4227-a40c-4f7773682880" Dec 01 09:16:36 crc kubenswrapper[4763]: I1201 09:16:36.654770 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lk4cs" podStartSLOduration=95.654754433 podStartE2EDuration="1m35.654754433s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:17.585567234 +0000 UTC m=+94.854216002" watchObservedRunningTime="2025-12-01 09:16:36.654754433 +0000 UTC m=+113.923403221" Dec 01 09:16:36 crc kubenswrapper[4763]: I1201 09:16:36.993786 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:36 crc kubenswrapper[4763]: E1201 09:16:36.993917 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:37 crc kubenswrapper[4763]: I1201 09:16:37.635716 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/1.log" Dec 01 09:16:37 crc kubenswrapper[4763]: I1201 09:16:37.994112 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:37 crc kubenswrapper[4763]: E1201 09:16:37.994439 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:37 crc kubenswrapper[4763]: I1201 09:16:37.994728 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:37 crc kubenswrapper[4763]: I1201 09:16:37.994787 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:37 crc kubenswrapper[4763]: E1201 09:16:37.994897 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:37 crc kubenswrapper[4763]: E1201 09:16:37.995073 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:38 crc kubenswrapper[4763]: I1201 09:16:38.994016 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:38 crc kubenswrapper[4763]: E1201 09:16:38.994131 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:39 crc kubenswrapper[4763]: I1201 09:16:39.993790 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:39 crc kubenswrapper[4763]: I1201 09:16:39.993796 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:39 crc kubenswrapper[4763]: E1201 09:16:39.994487 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:39 crc kubenswrapper[4763]: E1201 09:16:39.994580 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:39 crc kubenswrapper[4763]: I1201 09:16:39.993811 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:39 crc kubenswrapper[4763]: E1201 09:16:39.994820 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:40 crc kubenswrapper[4763]: I1201 09:16:40.993747 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:40 crc kubenswrapper[4763]: E1201 09:16:40.993965 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:41 crc kubenswrapper[4763]: I1201 09:16:41.993039 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:41 crc kubenswrapper[4763]: I1201 09:16:41.993070 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:41 crc kubenswrapper[4763]: I1201 09:16:41.993126 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:41 crc kubenswrapper[4763]: E1201 09:16:41.993285 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:41 crc kubenswrapper[4763]: E1201 09:16:41.993404 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:41 crc kubenswrapper[4763]: E1201 09:16:41.993565 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:42 crc kubenswrapper[4763]: E1201 09:16:42.947503 4763 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 01 09:16:42 crc kubenswrapper[4763]: I1201 09:16:42.994479 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:42 crc kubenswrapper[4763]: E1201 09:16:42.994558 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:43 crc kubenswrapper[4763]: E1201 09:16:43.105650 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 09:16:43 crc kubenswrapper[4763]: I1201 09:16:43.993577 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:43 crc kubenswrapper[4763]: I1201 09:16:43.993855 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:43 crc kubenswrapper[4763]: I1201 09:16:43.993864 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:43 crc kubenswrapper[4763]: E1201 09:16:43.993981 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:43 crc kubenswrapper[4763]: E1201 09:16:43.994158 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:43 crc kubenswrapper[4763]: I1201 09:16:43.994191 4763 scope.go:117] "RemoveContainer" containerID="de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3" Dec 01 09:16:43 crc kubenswrapper[4763]: E1201 09:16:43.994533 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:44 crc kubenswrapper[4763]: I1201 09:16:44.659869 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/3.log" Dec 01 09:16:44 crc kubenswrapper[4763]: I1201 09:16:44.662501 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerStarted","Data":"bb05ca306384e4013c5fc1b1f221725c94ed7d5bc1c9a6d8893fafd9ab0449df"} Dec 01 09:16:44 crc kubenswrapper[4763]: I1201 09:16:44.662963 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:16:44 crc kubenswrapper[4763]: I1201 09:16:44.690014 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podStartSLOduration=103.689999311 podStartE2EDuration="1m43.689999311s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:16:44.688527622 +0000 UTC m=+121.957176390" watchObservedRunningTime="2025-12-01 09:16:44.689999311 +0000 UTC m=+121.958648069" Dec 01 09:16:44 crc kubenswrapper[4763]: I1201 09:16:44.818026 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rtkzb"] Dec 01 09:16:44 crc kubenswrapper[4763]: I1201 09:16:44.818163 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:44 crc kubenswrapper[4763]: E1201 09:16:44.818304 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:44 crc kubenswrapper[4763]: I1201 09:16:44.995682 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:44 crc kubenswrapper[4763]: E1201 09:16:44.995784 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:45 crc kubenswrapper[4763]: I1201 09:16:45.993169 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:45 crc kubenswrapper[4763]: I1201 09:16:45.993222 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:45 crc kubenswrapper[4763]: E1201 09:16:45.993295 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:45 crc kubenswrapper[4763]: E1201 09:16:45.993369 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:46 crc kubenswrapper[4763]: I1201 09:16:46.994054 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:46 crc kubenswrapper[4763]: I1201 09:16:46.994119 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:46 crc kubenswrapper[4763]: E1201 09:16:46.994187 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:46 crc kubenswrapper[4763]: E1201 09:16:46.994268 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:47 crc kubenswrapper[4763]: I1201 09:16:47.993421 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:47 crc kubenswrapper[4763]: E1201 09:16:47.994000 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:47 crc kubenswrapper[4763]: I1201 09:16:47.993421 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:47 crc kubenswrapper[4763]: E1201 09:16:47.994207 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:47 crc kubenswrapper[4763]: I1201 09:16:47.994348 4763 scope.go:117] "RemoveContainer" containerID="92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705" Dec 01 09:16:48 crc kubenswrapper[4763]: E1201 09:16:48.107892 4763 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 01 09:16:48 crc kubenswrapper[4763]: I1201 09:16:48.676594 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/1.log" Dec 01 09:16:48 crc kubenswrapper[4763]: I1201 09:16:48.676684 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr552" event={"ID":"192e1ecd-fa1f-4227-a40c-4f7773682880","Type":"ContainerStarted","Data":"f7559b28f34a26d39e47f026311b6e84e5ea88b4fa9d864b01d93cf5a16b187e"} Dec 01 09:16:48 crc kubenswrapper[4763]: I1201 09:16:48.993773 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:48 crc kubenswrapper[4763]: I1201 09:16:48.993824 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:48 crc kubenswrapper[4763]: E1201 09:16:48.993920 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:48 crc kubenswrapper[4763]: E1201 09:16:48.994081 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:49 crc kubenswrapper[4763]: I1201 09:16:49.993440 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:49 crc kubenswrapper[4763]: I1201 09:16:49.993513 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:49 crc kubenswrapper[4763]: E1201 09:16:49.993623 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:49 crc kubenswrapper[4763]: E1201 09:16:49.993680 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:50 crc kubenswrapper[4763]: I1201 09:16:50.993852 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:50 crc kubenswrapper[4763]: I1201 09:16:50.993896 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:50 crc kubenswrapper[4763]: E1201 09:16:50.994020 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:50 crc kubenswrapper[4763]: E1201 09:16:50.994109 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:51 crc kubenswrapper[4763]: I1201 09:16:51.993556 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:51 crc kubenswrapper[4763]: E1201 09:16:51.993687 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 01 09:16:51 crc kubenswrapper[4763]: I1201 09:16:51.993571 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:51 crc kubenswrapper[4763]: E1201 09:16:51.993901 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 01 09:16:52 crc kubenswrapper[4763]: I1201 09:16:52.993731 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:52 crc kubenswrapper[4763]: I1201 09:16:52.993751 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:52 crc kubenswrapper[4763]: E1201 09:16:52.996918 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 01 09:16:52 crc kubenswrapper[4763]: E1201 09:16:52.997294 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rtkzb" podUID="db50acd1-5694-49bc-9027-e96f7612e795" Dec 01 09:16:53 crc kubenswrapper[4763]: I1201 09:16:53.993802 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:16:53 crc kubenswrapper[4763]: I1201 09:16:53.993816 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:16:53 crc kubenswrapper[4763]: I1201 09:16:53.996502 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 01 09:16:54 crc kubenswrapper[4763]: I1201 09:16:53.996859 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 01 09:16:54 crc kubenswrapper[4763]: I1201 09:16:53.998354 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 01 09:16:54 crc kubenswrapper[4763]: I1201 09:16:53.998588 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 09:16:54 crc kubenswrapper[4763]: I1201 09:16:54.994561 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:16:54 crc kubenswrapper[4763]: I1201 09:16:54.994801 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:16:54 crc kubenswrapper[4763]: I1201 09:16:54.996828 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 01 09:16:54 crc kubenswrapper[4763]: I1201 09:16:54.997338 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.130648 4763 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.164606 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-56vb4"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.165272 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.165736 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.166331 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.171072 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.171156 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.176131 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.176132 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.176182 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.176303 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.177006 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.177368 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.177378 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.177542 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.177570 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.177651 4763 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.177908 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.178041 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.178109 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.179254 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-lk65c"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.179692 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lk65c" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.180559 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-458q5"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.180869 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.181781 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-548t8"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.182096 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.183045 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxfdf"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.183336 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.184473 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vcvq7"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.184873 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.186266 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qdpq8"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.187286 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.188188 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.189926 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.190679 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.190874 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.191347 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.191548 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.191732 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.191926 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.192068 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.192203 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.192351 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.192505 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.192782 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.193028 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.193197 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.197966 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.212889 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l4kcj"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.213517 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.214308 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.213641 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.213681 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.208955 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.214885 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.210923 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.211209 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.211280 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.217322 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.211693 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.217630 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.211727 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.211960 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.212034 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.212126 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.214794 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.214984 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215063 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215132 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215170 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215243 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215278 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215316 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215346 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215387 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.215821 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.236003 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.246270 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.246389 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.246823 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.246827 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbq9k\" (UniqueName: \"kubernetes.io/projected/940c2b58-e113-4dc3-8717-6d6be27a033d-kube-api-access-zbq9k\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-service-ca-bundle\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247326 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247404 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247488 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-console-config\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247540 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wf2br"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247689 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 
09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247799 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-serving-cert\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247866 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-serving-cert\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-config\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248005 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb7a73a-5d2e-4134-a0f1-d04e06492022-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248082 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/940c2b58-e113-4dc3-8717-6d6be27a033d-serving-cert\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248218 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247099 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248359 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-images\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248430 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-service-ca\") pod 
\"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248034 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248561 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248517 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll2dk\" (UniqueName: \"kubernetes.io/projected/c683385a-b0c6-449e-bc11-b24c3824cb7d-kube-api-access-ll2dk\") pod \"dns-operator-744455d44c-vcvq7\" (UID: \"c683385a-b0c6-449e-bc11-b24c3824cb7d\") " pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248686 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248768 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248851 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-config\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248931 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-config\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248950 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247155 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.249017 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqj2h\" (UniqueName: \"kubernetes.io/projected/9bb7a73a-5d2e-4134-a0f1-d04e06492022-kube-api-access-cqj2h\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.249186 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247317 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.249211 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdbmf\" (UniqueName: \"kubernetes.io/projected/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-kube-api-access-tdbmf\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247431 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.249237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-config\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.249255 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-audit-policies\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247480 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.247552 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.246889 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248006 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248117 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248178 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.248214 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.249282 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.249866 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvctz\" (UniqueName: \"kubernetes.io/projected/f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4-kube-api-access-tvctz\") pod \"downloads-7954f5f757-lk65c\" (UID: \"f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4\") " pod="openshift-console/downloads-7954f5f757-lk65c" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.249941 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250011 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-trusted-ca\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250174 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-oauth-config\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: 
I1201 09:16:57.250236 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-client-ca\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250314 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rx6v\" (UniqueName: \"kubernetes.io/projected/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-kube-api-access-2rx6v\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250406 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250504 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/53a538a2-4a8b-4524-aca4-5eff4f91cce5-audit-dir\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250579 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z982\" (UniqueName: \"kubernetes.io/projected/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-kube-api-access-5z982\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250650 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cknl4\" (UniqueName: \"kubernetes.io/projected/f0ccd14f-5d77-4541-860f-d834079cf97f-kube-api-access-cknl4\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250718 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/08a32dd3-b775-4153-a505-99b17e1637b1-audit-dir\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250782 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250868 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-trusted-ca-bundle\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.250934 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251005 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-encryption-config\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251181 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb7a73a-5d2e-4134-a0f1-d04e06492022-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251256 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-audit-policies\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251328 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-etcd-client\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251398 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251487 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-oauth-serving-cert\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c683385a-b0c6-449e-bc11-b24c3824cb7d-metrics-tls\") pod \"dns-operator-744455d44c-vcvq7\" (UID: \"c683385a-b0c6-449e-bc11-b24c3824cb7d\") " pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251651 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg695\" (UniqueName: \"kubernetes.io/projected/08a32dd3-b775-4153-a505-99b17e1637b1-kube-api-access-cg695\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251785 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xslb\" (UniqueName: \"kubernetes.io/projected/53a538a2-4a8b-4524-aca4-5eff4f91cce5-kube-api-access-5xslb\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-serving-cert\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251945 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-serving-cert\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251337 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251488 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.251722 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 01 09:16:57 crc 
kubenswrapper[4763]: I1201 09:16:57.254174 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tq979"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.254976 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.255356 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.255688 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.256093 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.259430 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.264594 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5rbk"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.264965 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.265182 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.265563 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.265959 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.266095 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.266315 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.266872 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.269056 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sjmfr"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.272044 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.270035 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.270067 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.276168 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.276555 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.278286 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.278475 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.278580 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.278726 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.278908 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.279004 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.279092 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.279227 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.279315 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.281478 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.282137 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.282761 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.283197 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.283694 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.284073 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.284591 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.284781 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.285070 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.285139 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.285382 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.285172 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.302568 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.303044 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.303255 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.303394 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.303939 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.304117 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.304476 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.304526 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.304712 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.304994 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.305476 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.310368 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.310649 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-z5zrr"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.310662 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.310730 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.310973 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.322925 4763 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.323156 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.323266 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.324003 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.324393 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.324762 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.325083 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.325347 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.325470 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.325617 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.326154 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.326176 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.326836 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4pz4m"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.327025 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.327249 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.327655 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.328167 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.331574 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.332094 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gsfgd"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.332473 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.332686 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.333530 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.337708 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.338165 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-blfn6"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.338664 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.338870 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.340807 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lk65c"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.342158 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.345945 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.347609 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.347731 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-56vb4"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.348522 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-458q5"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.351413 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.351467 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.351477 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l4kcj"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.352913 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxfdf"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353231 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-config\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353253 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-config\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353272 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqj2h\" (UniqueName: \"kubernetes.io/projected/9bb7a73a-5d2e-4134-a0f1-d04e06492022-kube-api-access-cqj2h\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353306 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdbmf\" (UniqueName: \"kubernetes.io/projected/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-kube-api-access-tdbmf\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " 
pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-config\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353336 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-audit-policies\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353370 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvctz\" (UniqueName: \"kubernetes.io/projected/f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4-kube-api-access-tvctz\") pod \"downloads-7954f5f757-lk65c\" (UID: \"f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4\") " pod="openshift-console/downloads-7954f5f757-lk65c" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353386 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353401 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353416 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-trusted-ca\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.353431 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-oauth-config\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.354954 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/36509e1a-f77b-4e69-85bf-e018b27205d2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355017 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-client-ca\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rx6v\" (UniqueName: \"kubernetes.io/projected/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-kube-api-access-2rx6v\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355076 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/53a538a2-4a8b-4524-aca4-5eff4f91cce5-audit-dir\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355095 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36509e1a-f77b-4e69-85bf-e018b27205d2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z982\" (UniqueName: \"kubernetes.io/projected/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-kube-api-access-5z982\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cknl4\" (UniqueName: \"kubernetes.io/projected/f0ccd14f-5d77-4541-860f-d834079cf97f-kube-api-access-cknl4\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355157 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/08a32dd3-b775-4153-a505-99b17e1637b1-audit-dir\") pod 
\"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355177 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-trusted-ca-bundle\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355248 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-encryption-config\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355264 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb7a73a-5d2e-4134-a0f1-d04e06492022-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355277 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-audit-policies\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-etcd-client\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 
09:16:57.355310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zcq6\" (UniqueName: \"kubernetes.io/projected/36509e1a-f77b-4e69-85bf-e018b27205d2-kube-api-access-7zcq6\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355360 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-oauth-serving-cert\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355376 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c683385a-b0c6-449e-bc11-b24c3824cb7d-metrics-tls\") pod \"dns-operator-744455d44c-vcvq7\" (UID: \"c683385a-b0c6-449e-bc11-b24c3824cb7d\") " pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355391 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355408 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg695\" (UniqueName: \"kubernetes.io/projected/08a32dd3-b775-4153-a505-99b17e1637b1-kube-api-access-cg695\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355424 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xslb\" (UniqueName: \"kubernetes.io/projected/53a538a2-4a8b-4524-aca4-5eff4f91cce5-kube-api-access-5xslb\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355448 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-serving-cert\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355486 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-serving-cert\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355501 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355516 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbq9k\" (UniqueName: \"kubernetes.io/projected/940c2b58-e113-4dc3-8717-6d6be27a033d-kube-api-access-zbq9k\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355531 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-service-ca-bundle\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-console-config\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355599 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-serving-cert\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355615 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-serving-cert\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355630 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/940c2b58-e113-4dc3-8717-6d6be27a033d-serving-cert\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355644 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-config\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355658 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb7a73a-5d2e-4134-a0f1-d04e06492022-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355677 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-images\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355692 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-service-ca\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355711 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll2dk\" (UniqueName: \"kubernetes.io/projected/c683385a-b0c6-449e-bc11-b24c3824cb7d-kube-api-access-ll2dk\") pod \"dns-operator-744455d44c-vcvq7\" (UID: \"c683385a-b0c6-449e-bc11-b24c3824cb7d\") " pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355726 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.355761 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.356806 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb7a73a-5d2e-4134-a0f1-d04e06492022-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.357116 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-service-ca-bundle\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.357500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-config\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.357677 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-audit-policies\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.357718 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-client-ca\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.358781 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-config\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.359327 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-config\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.359974 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.361635 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.362921 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qdpq8"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.362954 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.362966 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pqkfc"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.363473 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.366703 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-config\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.369178 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-serving-cert\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.371326 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/53a538a2-4a8b-4524-aca4-5eff4f91cce5-audit-dir\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.371617 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/08a32dd3-b775-4153-a505-99b17e1637b1-audit-dir\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.372297 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-console-config\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.372753 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx"] Dec 01 09:16:57 
crc kubenswrapper[4763]: I1201 09:16:57.372782 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wf2br"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.372793 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.377787 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.381000 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-images\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.381662 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.381732 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-service-ca\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.383957 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.384644 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-oauth-serving-cert\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.384695 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.384736 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q5npm"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.385783 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.387826 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-audit-policies\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.388542 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-trusted-ca-bundle\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.389878 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sjmfr"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.389974 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q5npm" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.392594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-trusted-ca\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.393593 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.393647 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.393809 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.394696 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-etcd-client\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.395233 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53a538a2-4a8b-4524-aca4-5eff4f91cce5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.396073 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-serving-cert\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " 
pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.396400 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb7a73a-5d2e-4134-a0f1-d04e06492022-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.396693 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.396851 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.396962 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.397170 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.397995 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.398192 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tq979"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.399137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.399876 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-encryption-config\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.399924 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.400493 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/c683385a-b0c6-449e-bc11-b24c3824cb7d-metrics-tls\") pod \"dns-operator-744455d44c-vcvq7\" (UID: \"c683385a-b0c6-449e-bc11-b24c3824cb7d\") " pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.400871 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.400904 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.402385 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.402502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.403350 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vcvq7"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.405354 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.411889 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53a538a2-4a8b-4524-aca4-5eff4f91cce5-serving-cert\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.412176 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/940c2b58-e113-4dc3-8717-6d6be27a033d-serving-cert\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.412425 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.413251 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.413383 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-548t8"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 
09:16:57.418043 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.418111 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sqjc8"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.420963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-oauth-config\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.421178 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.429604 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.429626 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.424279 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sqjc8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.430664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-serving-cert\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.435103 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.437133 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5rbk"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.439630 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.445371 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.446677 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.447074 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.449388 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sqjc8"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.449644 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-blfn6"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.454026 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.454064 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4pz4m"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.454131 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gsfgd"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.456066 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q5npm"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.456440 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zcq6\" (UniqueName: \"kubernetes.io/projected/36509e1a-f77b-4e69-85bf-e018b27205d2-kube-api-access-7zcq6\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.456550 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36509e1a-f77b-4e69-85bf-e018b27205d2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.456574 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36509e1a-f77b-4e69-85bf-e018b27205d2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.457074 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.458396 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.460013 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ft6z9"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.461599 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ft6z9"] Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.461778 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.462248 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36509e1a-f77b-4e69-85bf-e018b27205d2-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.466989 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36509e1a-f77b-4e69-85bf-e018b27205d2-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.468285 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.486964 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.507611 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.528325 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.547230 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.567194 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.587284 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.607624 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.627337 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.647610 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.667303 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.687669 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.707185 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.727448 4763 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.748137 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.767775 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.787042 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.807621 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.827808 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.847353 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.868194 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.887644 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.907856 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.927756 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.955041 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.967809 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 01 09:16:57 crc kubenswrapper[4763]: I1201 09:16:57.987696 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.007323 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.027587 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.047959 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.067673 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.087554 4763 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.108680 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.127954 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.148027 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.166696 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.208709 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.247674 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.266925 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.288096 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.307583 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.326093 4763 request.go:700] Waited for 1.000324539s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpprof-cert&limit=500&resourceVersion=0 Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.327381 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.348019 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.367803 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.388252 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.408788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.427212 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.447083 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.468342 4763 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.487851 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.510737 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.528066 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.548114 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.568209 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.588099 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.608101 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.634208 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.648725 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.668366 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.687916 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.707029 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.727774 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.748213 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.767891 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.787373 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.807761 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.826794 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 
09:16:58.848901 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.867799 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.887525 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.907752 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.927415 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.948272 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.987293 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rx6v\" (UniqueName: \"kubernetes.io/projected/2307c38a-2af7-4b03-b99a-e5ca5bed76a8-kube-api-access-2rx6v\") pod \"machine-api-operator-5694c8668f-l4kcj\" (UID: \"2307c38a-2af7-4b03-b99a-e5ca5bed76a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:58 crc kubenswrapper[4763]: I1201 09:16:58.999509 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.015107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqj2h\" (UniqueName: \"kubernetes.io/projected/9bb7a73a-5d2e-4134-a0f1-d04e06492022-kube-api-access-cqj2h\") pod \"openshift-controller-manager-operator-756b6f6bc6-9plk4\" (UID: \"9bb7a73a-5d2e-4134-a0f1-d04e06492022\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.026301 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdbmf\" (UniqueName: \"kubernetes.io/projected/c7635c69-b75b-4ecf-89f8-ffb47bf995c5-kube-api-access-tdbmf\") pod \"console-operator-58897d9998-548t8\" (UID: \"c7635c69-b75b-4ecf-89f8-ffb47bf995c5\") " pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.029981 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.047734 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.067858 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.102410 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll2dk\" (UniqueName: \"kubernetes.io/projected/c683385a-b0c6-449e-bc11-b24c3824cb7d-kube-api-access-ll2dk\") pod \"dns-operator-744455d44c-vcvq7\" (UID: 
\"c683385a-b0c6-449e-bc11-b24c3824cb7d\") " pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.121237 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z982\" (UniqueName: \"kubernetes.io/projected/ae9504b2-987f-4ea3-bed5-7a4b2ea10178-kube-api-access-5z982\") pod \"authentication-operator-69f744f599-56vb4\" (UID: \"ae9504b2-987f-4ea3-bed5-7a4b2ea10178\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.139171 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.147471 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.149003 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cknl4\" (UniqueName: \"kubernetes.io/projected/f0ccd14f-5d77-4541-860f-d834079cf97f-kube-api-access-cknl4\") pod \"console-f9d7485db-458q5\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.169940 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.176609 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-l4kcj"] Dec 01 09:16:59 crc kubenswrapper[4763]: W1201 09:16:59.184550 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2307c38a_2af7_4b03_b99a_e5ca5bed76a8.slice/crio-32f49c51b8208796588c06537344670a2bbd679a9c5a00220965104e87c6c93f WatchSource:0}: Error finding container 32f49c51b8208796588c06537344670a2bbd679a9c5a00220965104e87c6c93f: Status 404 returned error can't find the container with id 32f49c51b8208796588c06537344670a2bbd679a9c5a00220965104e87c6c93f Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.187357 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.208043 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.246176 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvctz\" (UniqueName: \"kubernetes.io/projected/f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4-kube-api-access-tvctz\") pod \"downloads-7954f5f757-lk65c\" (UID: \"f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4\") " pod="openshift-console/downloads-7954f5f757-lk65c" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.266016 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg695\" (UniqueName: \"kubernetes.io/projected/08a32dd3-b775-4153-a505-99b17e1637b1-kube-api-access-cg695\") pod \"oauth-openshift-558db77b4-mxfdf\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.272024 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.281036 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.292472 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.309654 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.309690 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbq9k\" (UniqueName: \"kubernetes.io/projected/940c2b58-e113-4dc3-8717-6d6be27a033d-kube-api-access-zbq9k\") pod \"controller-manager-879f6c89f-qdpq8\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.314510 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-548t8"] Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.327848 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.345640 4763 request.go:700] Waited for 1.915549541s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.347163 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.374495 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lk65c" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.387372 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.387594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zcq6\" (UniqueName: \"kubernetes.io/projected/36509e1a-f77b-4e69-85bf-e018b27205d2-kube-api-access-7zcq6\") pod \"openshift-apiserver-operator-796bbdcf4f-pjz7s\" (UID: \"36509e1a-f77b-4e69-85bf-e018b27205d2\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.408149 4763 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.426259 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.428007 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.481447 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-56vb4"] Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.516993 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vcvq7"] Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.759990 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-548t8" event={"ID":"c7635c69-b75b-4ecf-89f8-ffb47bf995c5","Type":"ContainerStarted","Data":"e59d10f566384e5fe250b4968e10c1eb6143e1aebb92a6b8f22dfa2b644cfaa2"} Dec 01 09:16:59 crc kubenswrapper[4763]: I1201 09:16:59.761121 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" event={"ID":"2307c38a-2af7-4b03-b99a-e5ca5bed76a8","Type":"ContainerStarted","Data":"32f49c51b8208796588c06537344670a2bbd679a9c5a00220965104e87c6c93f"} Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.591716 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.593349 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.594431 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.607245 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-tls\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.607769 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-trusted-ca\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.608203 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: E1201 09:17:00.608912 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-01 09:17:01.108892394 +0000 UTC m=+138.377541172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.611502 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.611601 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l66sm\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-kube-api-access-l66sm\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.611764 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.611946 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-certificates\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.612055 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-bound-sa-token\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.639057 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xslb\" (UniqueName: \"kubernetes.io/projected/53a538a2-4a8b-4524-aca4-5eff4f91cce5-kube-api-access-5xslb\") pod \"apiserver-7bbb656c7d-kv4n2\" (UID: \"53a538a2-4a8b-4524-aca4-5eff4f91cce5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.667578 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4"] Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.713257 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:00 crc kubenswrapper[4763]: E1201 09:17:00.713887 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.213440958 +0000 UTC m=+138.482089736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714429 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2cll\" (UniqueName: \"kubernetes.io/projected/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-kube-api-access-v2cll\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: \"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714479 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9hww\" (UniqueName: \"kubernetes.io/projected/f6366d07-d82c-4e35-9c94-946426119bde-kube-api-access-h9hww\") pod \"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714510 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ead8d5dc-6a02-4fd1-8c68-c137fd26bda9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sjmfr\" (UID: \"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714527 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs489\" (UniqueName: \"kubernetes.io/projected/ead8d5dc-6a02-4fd1-8c68-c137fd26bda9-kube-api-access-qs489\") pod \"multus-admission-controller-857f4d67dd-sjmfr\" (UID: \"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714559 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-metrics-tls\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714574 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f6366d07-d82c-4e35-9c94-946426119bde-srv-cert\") pod \"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714588 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a77fb4c4-4741-4f18-aae2-3aefb20448d0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714602 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-config\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714632 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj454\" (UniqueName: \"kubernetes.io/projected/be60db96-5a06-453d-b95e-4637aa61e1f1-kube-api-access-mj454\") pod \"package-server-manager-789f6589d5-t9rl9\" (UID: \"be60db96-5a06-453d-b95e-4637aa61e1f1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714649 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/92de651e-81ec-432a-a942-91b959acb4d2-certs\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714664 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6366d07-d82c-4e35-9c94-946426119bde-profile-collector-cert\") pod \"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714677 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-config-volume\") pod \"dns-default-sqjc8\" (UID: \"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714707 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbw2d\" (UniqueName: \"kubernetes.io/projected/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-kube-api-access-pbw2d\") pod \"dns-default-sqjc8\" (UID: \"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714721 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a48c156a-fbc7-481c-96cc-201992569c1e-serving-cert\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714747 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-metrics-certs\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714763 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxx6\" (UniqueName: \"kubernetes.io/projected/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-kube-api-access-9cxx6\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714803 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-serving-cert\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714820 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a77fb4c4-4741-4f18-aae2-3aefb20448d0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714836 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f45a90-202d-4042-96d3-1b24683fc0b6-service-ca-bundle\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714867 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/301fa88d-642c-421f-b8b9-8661393d3ac1-webhook-cert\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714882 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-ca\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714896 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-etcd-serving-ca\") pod \"apiserver-76f77b778f-wf2br\" 
(UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714911 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: \"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714953 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f6329e56-18d1-4479-8699-897fdfdc60fb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-l2f2g\" (UID: \"f6329e56-18d1-4479-8699-897fdfdc60fb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714972 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48c156a-fbc7-481c-96cc-201992569c1e-config\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.714998 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7bml\" (UniqueName: \"kubernetes.io/projected/c78ac8a1-6273-4efe-9352-82d5aee9e048-kube-api-access-r7bml\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715121 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47303b2b-8915-4828-933b-52f4804bd423-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715144 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-proxy-tls\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715161 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a77fb4c4-4741-4f18-aae2-3aefb20448d0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715272 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/22fc92d5-a686-419c-ae28-a87874c0f06f-signing-cabundle\") pod \"service-ca-9c57cc56f-gsfgd\" (UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715288 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/be60db96-5a06-453d-b95e-4637aa61e1f1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t9rl9\" (UID: \"be60db96-5a06-453d-b95e-4637aa61e1f1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715309 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-trusted-ca\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715421 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-certificates\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715441 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-etcd-client\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715906 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wxd7\" (UniqueName: \"kubernetes.io/projected/47303b2b-8915-4828-933b-52f4804bd423-kube-api-access-8wxd7\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.715928 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wsqq\" (UniqueName: \"kubernetes.io/projected/301fa88d-642c-421f-b8b9-8661393d3ac1-kube-api-access-9wsqq\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716055 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffgx2\" (UniqueName: \"kubernetes.io/projected/5e605f4b-743f-42b2-a437-64983f66992b-kube-api-access-ffgx2\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716100 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-config\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c54cf3d9-0427-43dc-816c-a56ce2c56c83-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b7a445-e2e0-4bed-bd90-7015ecfc4645-config\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716501 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-tls\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716587 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t4nd\" (UniqueName: \"kubernetes.io/projected/0d624135-9b60-436b-a1a5-02d6028880ae-kube-api-access-2t4nd\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716609 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtj8f\" (UniqueName: \"kubernetes.io/projected/34ce1090-925c-45cc-b797-a08ddbe3dd98-kube-api-access-xtj8f\") pod \"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716808 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.716830 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-csi-data-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717016 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwh8r\" (UniqueName: 
\"kubernetes.io/projected/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-kube-api-access-cwh8r\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717061 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c78ac8a1-6273-4efe-9352-82d5aee9e048-serving-cert\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-encryption-config\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717171 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d624135-9b60-436b-a1a5-02d6028880ae-srv-cert\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717552 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-client\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301fa88d-642c-421f-b8b9-8661393d3ac1-tmpfs\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717615 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717645 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbsjg\" (UniqueName: \"kubernetes.io/projected/dc8d6026-9d55-4b47-8a46-00480cd75fe0-kube-api-access-sbsjg\") pod \"migrator-59844c95c7-6c2mw\" (UID: \"dc8d6026-9d55-4b47-8a46-00480cd75fe0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34ce1090-925c-45cc-b797-a08ddbe3dd98-secret-volume\") pod 
\"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717696 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c748802d-9eb3-4f13-80d8-9101979e400e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5s26f\" (UID: \"c748802d-9eb3-4f13-80d8-9101979e400e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717723 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-socket-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717739 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-plugins-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717765 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khg4r\" (UniqueName: \"kubernetes.io/projected/c54cf3d9-0427-43dc-816c-a56ce2c56c83-kube-api-access-khg4r\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717794 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdkb\" (UniqueName: \"kubernetes.io/projected/40081104-3347-4d8a-bfe9-04c6f86948be-kube-api-access-4mdkb\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717811 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm4fx\" (UniqueName: \"kubernetes.io/projected/f6329e56-18d1-4479-8699-897fdfdc60fb-kube-api-access-tm4fx\") pod \"control-plane-machine-set-operator-78cbb6b69f-l2f2g\" (UID: \"f6329e56-18d1-4479-8699-897fdfdc60fb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717828 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-client-ca\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717843 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-service-ca\") pod 
\"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717862 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/92de651e-81ec-432a-a942-91b959acb4d2-node-bootstrap-token\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717907 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nql8z\" (UniqueName: \"kubernetes.io/projected/c748802d-9eb3-4f13-80d8-9101979e400e-kube-api-access-nql8z\") pod \"cluster-samples-operator-665b6dd947-5s26f\" (UID: \"c748802d-9eb3-4f13-80d8-9101979e400e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717927 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-default-certificate\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717982 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l66sm\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-kube-api-access-l66sm\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.717996 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-audit\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: 
\"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718030 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/05b7a445-e2e0-4bed-bd90-7015ecfc4645-machine-approver-tls\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718047 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-config\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718065 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e605f4b-743f-42b2-a437-64983f66992b-serving-cert\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718081 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: \"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718098 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718114 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c79cs\" (UniqueName: \"kubernetes.io/projected/05b7a445-e2e0-4bed-bd90-7015ecfc4645-kube-api-access-c79cs\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718131 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c78ac8a1-6273-4efe-9352-82d5aee9e048-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718148 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/47303b2b-8915-4828-933b-52f4804bd423-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718166 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a824d314-6a56-4109-83d7-031171aeb8e6-serving-cert\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718181 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6wwj\" (UniqueName: \"kubernetes.io/projected/a824d314-6a56-4109-83d7-031171aeb8e6-kube-api-access-d6wwj\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718194 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-registration-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718213 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22fc92d5-a686-419c-ae28-a87874c0f06f-signing-key\") pod \"service-ca-9c57cc56f-gsfgd\" (UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718227 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: \"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd55w\" (UniqueName: \"kubernetes.io/projected/92de651e-81ec-432a-a942-91b959acb4d2-kube-api-access-gd55w\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6850b102-3de9-4180-9897-79a45817954a-cert\") pod \"ingress-canary-q5npm\" (UID: \"6850b102-3de9-4180-9897-79a45817954a\") " pod="openshift-ingress-canary/ingress-canary-q5npm" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718283 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-bound-sa-token\") pod 
\"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-image-import-ca\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718314 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718328 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/301fa88d-642c-421f-b8b9-8661393d3ac1-apiservice-cert\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718397 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49jhs\" (UniqueName: \"kubernetes.io/projected/81f45a90-202d-4042-96d3-1b24683fc0b6-kube-api-access-49jhs\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718421 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-metrics-tls\") pod \"dns-default-sqjc8\" (UID: \"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4ktz\" (UniqueName: \"kubernetes.io/projected/22fc92d5-a686-419c-ae28-a87874c0f06f-kube-api-access-p4ktz\") pod \"service-ca-9c57cc56f-gsfgd\" (UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718492 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34ce1090-925c-45cc-b797-a08ddbe3dd98-config-volume\") pod \"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718513 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6qq\" (UniqueName: \"kubernetes.io/projected/6850b102-3de9-4180-9897-79a45817954a-kube-api-access-zv6qq\") pod \"ingress-canary-q5npm\" (UID: \"6850b102-3de9-4180-9897-79a45817954a\") " pod="openshift-ingress-canary/ingress-canary-q5npm" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718537 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-trusted-ca\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-config\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718594 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.718633 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mnpv\" (UniqueName: \"kubernetes.io/projected/440365f2-877d-49bd-89c3-0dc4ad54efaa-kube-api-access-2mnpv\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-ca-trust-extracted\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719082 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719118 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719146 
4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05b7a445-e2e0-4bed-bd90-7015ecfc4645-auth-proxy-config\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719171 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40081104-3347-4d8a-bfe9-04c6f86948be-audit-dir\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719196 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c54cf3d9-0427-43dc-816c-a56ce2c56c83-proxy-tls\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719218 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-stats-auth\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719242 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0d624135-9b60-436b-a1a5-02d6028880ae-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719280 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/40081104-3347-4d8a-bfe9-04c6f86948be-node-pullsecrets\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719306 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95rdt\" (UniqueName: \"kubernetes.io/projected/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-kube-api-access-95rdt\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719329 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njkgv\" (UniqueName: \"kubernetes.io/projected/a48c156a-fbc7-481c-96cc-201992569c1e-kube-api-access-njkgv\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719349 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-mountpoint-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9"
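
The reconciler_common.go records above and below are the kubelet volume manager's reconciler working through its desired state of world after the restart: for each volume a pod needs, it first confirms the attach has completed (VerifyControllerAttachedVolume, reconciler_common.go:245), then starts the mount (MountVolume, reconciler_common.go:218), and operation_generator.go:637 reports "MountVolume.SetUp succeeded" once the volume is set up under the pod's directory in /var/lib/kubelet/pods/<pod-UID>/volumes/. A compressed, illustrative sketch of that loop (names are invented here; the real code lives in the kubelet's pkg/kubelet/volumemanager/reconciler package):

```go
package main

import "fmt"

// volume identifies one mount the desired state says a pod needs.
type volume struct{ pod, name string }

// reconcile walks desired state and mounts whatever is not mounted yet,
// mirroring the VerifyControllerAttachedVolume -> MountVolume ->
// "MountVolume.SetUp succeeded" progression in the surrounding records.
func reconcile(desired []volume, mounted map[volume]bool) {
	for _, v := range desired {
		if mounted[v] {
			continue // actual state already matches desired state
		}
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q pod %q\n", v.name, v.pod)
		fmt.Printf("MountVolume.SetUp succeeded for volume %q pod %q\n", v.name, v.pod)
		mounted[v] = true
	}
}

func main() {
	// Two volumes taken from the log, purely as sample data.
	desired := []volume{
		{pod: "apiserver-76f77b778f-wf2br", name: "etcd-serving-ca"},
		{pod: "dns-default-sqjc8", name: "metrics-tls"},
	}
	reconcile(desired, map[volume]bool{})
}
```
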
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-mountpoint-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-config\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: \"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719391 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-images\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.719428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47303b2b-8915-4828-933b-52f4804bd423-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: E1201 09:17:00.719889 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.219874023 +0000 UTC m=+138.488522791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.721495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-trusted-ca\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.728493 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-certificates\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.744672 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-tls\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.751390 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-installation-pull-secrets\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.754656 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l66sm\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-kube-api-access-l66sm\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.759390 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-bound-sa-token\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.798503 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" event={"ID":"9bb7a73a-5d2e-4134-a0f1-d04e06492022","Type":"ContainerStarted","Data":"afa3db87b974831f925b0bc66e25f2dcb43b8fe5cf9dc5dcc8faa3296de9e4c6"} Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.801593 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" 
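
The E1201 record above is the first notable failure in this stream: the image-registry pod's PVC pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 is backed by the kubevirt.io.hostpath-provisioner CSI driver, but at this point in startup the driver's node plugin (evidently the csi-hostpathplugin-ft6z9 pod in the hostpath-provisioner namespace, whose own volumes are still being mounted in these same records) has not yet registered with the kubelet, so MountVolume.MountDevice cannot build a CSI client and the operation is requeued with a 500ms backoff. The interleaved "SyncLoop (PLEG)" records are unrelated progress from the pod lifecycle event generator reporting containers of other pods starting. A minimal sketch of the lookup that produces this error, assuming a simple name-to-endpoint registry rather than the kubelet's actual csi_plugin.go structures:

```go
package main

import (
	"fmt"
	"sync"
)

// driverRegistry loosely mimics the kubelet's in-memory map of CSI drivers
// that have completed plugin registration; the names are illustrative.
type driverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]string // driver name -> unix socket endpoint
}

func (r *driverRegistry) register(name, endpoint string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.drivers[name] = endpoint
}

// newClient fails the same way the log line does when the driver
// has not registered yet.
func (r *driverRegistry) newClient(name string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	ep, ok := r.drivers[name]
	if !ok {
		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
	}
	return ep, nil
}

func main() {
	reg := &driverRegistry{drivers: map[string]string{}}

	// Before the hostpath plugin pod is up: the lookup fails.
	if _, err := reg.newClient("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Println("mount attempt:", err)
	}

	// Once the node plugin registers, the retried mount can proceed.
	// (The socket path below is hypothetical.)
	reg.register("kubevirt.io.hostpath-provisioner", "/var/lib/kubelet/plugins/csi-hostpath/csi.sock")
	if ep, err := reg.newClient("kubevirt.io.hostpath-provisioner"); err == nil {
		fmt.Println("retry succeeds via", ep)
	}
}
```

Once the hostpath plugin registers over the kubelet's plugin-registration socket, the retried operation finds the driver and the mount goes through; until then the kubelet keeps requeueing it, as the later records show.
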
event={"ID":"ae9504b2-987f-4ea3-bed5-7a4b2ea10178","Type":"ContainerStarted","Data":"245be4140570dff43f175e9b66fdc55881f2e43faea115241ab1adbe92f3f26b"} Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.805327 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" event={"ID":"c683385a-b0c6-449e-bc11-b24c3824cb7d","Type":"ContainerStarted","Data":"06ec2897ab0791be3a22f08198b3b949ff1d37b0e92e957349b5317524fe9df1"} Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.808217 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" event={"ID":"2307c38a-2af7-4b03-b99a-e5ca5bed76a8","Type":"ContainerStarted","Data":"d75d30a547f039e5a286a97c6d03ab0399b36a48eb3af5a9c6e0868b7805c562"} Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820432 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-serving-cert\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a77fb4c4-4741-4f18-aae2-3aefb20448d0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820515 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f45a90-202d-4042-96d3-1b24683fc0b6-service-ca-bundle\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820534 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/301fa88d-642c-421f-b8b9-8661393d3ac1-webhook-cert\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820550 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-ca\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820579 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: \"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820598 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f6329e56-18d1-4479-8699-897fdfdc60fb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-l2f2g\" (UID: \"f6329e56-18d1-4479-8699-897fdfdc60fb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48c156a-fbc7-481c-96cc-201992569c1e-config\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7bml\" (UniqueName: \"kubernetes.io/projected/c78ac8a1-6273-4efe-9352-82d5aee9e048-kube-api-access-r7bml\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47303b2b-8915-4828-933b-52f4804bd423-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820686 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-proxy-tls\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a77fb4c4-4741-4f18-aae2-3aefb20448d0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22fc92d5-a686-419c-ae28-a87874c0f06f-signing-cabundle\") pod \"service-ca-9c57cc56f-gsfgd\" 
(UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820736 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/be60db96-5a06-453d-b95e-4637aa61e1f1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t9rl9\" (UID: \"be60db96-5a06-453d-b95e-4637aa61e1f1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-trusted-ca\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.820767 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-etcd-client\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.821934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81f45a90-202d-4042-96d3-1b24683fc0b6-service-ca-bundle\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wxd7\" (UniqueName: \"kubernetes.io/projected/47303b2b-8915-4828-933b-52f4804bd423-kube-api-access-8wxd7\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822266 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wsqq\" (UniqueName: \"kubernetes.io/projected/301fa88d-642c-421f-b8b9-8661393d3ac1-kube-api-access-9wsqq\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822401 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffgx2\" (UniqueName: \"kubernetes.io/projected/5e605f4b-743f-42b2-a437-64983f66992b-kube-api-access-ffgx2\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-config\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 
09:17:00.822528 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c54cf3d9-0427-43dc-816c-a56ce2c56c83-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822550 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b7a445-e2e0-4bed-bd90-7015ecfc4645-config\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822572 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t4nd\" (UniqueName: \"kubernetes.io/projected/0d624135-9b60-436b-a1a5-02d6028880ae-kube-api-access-2t4nd\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822618 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtj8f\" (UniqueName: \"kubernetes.io/projected/34ce1090-925c-45cc-b797-a08ddbe3dd98-kube-api-access-xtj8f\") pod \"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822692 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-csi-data-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwh8r\" (UniqueName: \"kubernetes.io/projected/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-kube-api-access-cwh8r\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822774 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c78ac8a1-6273-4efe-9352-82d5aee9e048-serving-cert\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822811 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-encryption-config\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822898 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d624135-9b60-436b-a1a5-02d6028880ae-srv-cert\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822926 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-client\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822941 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301fa88d-642c-421f-b8b9-8661393d3ac1-tmpfs\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.822994 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823018 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbsjg\" (UniqueName: \"kubernetes.io/projected/dc8d6026-9d55-4b47-8a46-00480cd75fe0-kube-api-access-sbsjg\") pod \"migrator-59844c95c7-6c2mw\" (UID: \"dc8d6026-9d55-4b47-8a46-00480cd75fe0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823073 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a48c156a-fbc7-481c-96cc-201992569c1e-config\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823074 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34ce1090-925c-45cc-b797-a08ddbe3dd98-secret-volume\") pod \"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823161 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c748802d-9eb3-4f13-80d8-9101979e400e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5s26f\" (UID: \"c748802d-9eb3-4f13-80d8-9101979e400e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" Dec 01 
09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823184 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-socket-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823203 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-plugins-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823244 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khg4r\" (UniqueName: \"kubernetes.io/projected/c54cf3d9-0427-43dc-816c-a56ce2c56c83-kube-api-access-khg4r\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823270 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdkb\" (UniqueName: \"kubernetes.io/projected/40081104-3347-4d8a-bfe9-04c6f86948be-kube-api-access-4mdkb\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823307 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm4fx\" (UniqueName: \"kubernetes.io/projected/f6329e56-18d1-4479-8699-897fdfdc60fb-kube-api-access-tm4fx\") pod \"control-plane-machine-set-operator-78cbb6b69f-l2f2g\" (UID: \"f6329e56-18d1-4479-8699-897fdfdc60fb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823327 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-client-ca\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823343 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-service-ca\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823360 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823399 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/92de651e-81ec-432a-a942-91b959acb4d2-node-bootstrap-token\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823417 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nql8z\" (UniqueName: \"kubernetes.io/projected/c748802d-9eb3-4f13-80d8-9101979e400e-kube-api-access-nql8z\") pod \"cluster-samples-operator-665b6dd947-5s26f\" (UID: \"c748802d-9eb3-4f13-80d8-9101979e400e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823435 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-default-certificate\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-audit\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823513 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: \"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823529 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/05b7a445-e2e0-4bed-bd90-7015ecfc4645-machine-approver-tls\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823558 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-config\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823575 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e605f4b-743f-42b2-a437-64983f66992b-serving-cert\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823593 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: 
\"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823623 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c79cs\" (UniqueName: \"kubernetes.io/projected/05b7a445-e2e0-4bed-bd90-7015ecfc4645-kube-api-access-c79cs\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823650 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c78ac8a1-6273-4efe-9352-82d5aee9e048-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823668 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/47303b2b-8915-4828-933b-52f4804bd423-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823684 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a824d314-6a56-4109-83d7-031171aeb8e6-serving-cert\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823707 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6wwj\" (UniqueName: \"kubernetes.io/projected/a824d314-6a56-4109-83d7-031171aeb8e6-kube-api-access-d6wwj\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823721 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-registration-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22fc92d5-a686-419c-ae28-a87874c0f06f-signing-key\") pod \"service-ca-9c57cc56f-gsfgd\" (UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823756 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: \"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823771 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd55w\" (UniqueName: \"kubernetes.io/projected/92de651e-81ec-432a-a942-91b959acb4d2-kube-api-access-gd55w\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823786 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6850b102-3de9-4180-9897-79a45817954a-cert\") pod \"ingress-canary-q5npm\" (UID: \"6850b102-3de9-4180-9897-79a45817954a\") " pod="openshift-ingress-canary/ingress-canary-q5npm" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823804 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-image-import-ca\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823819 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823835 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823861 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/301fa88d-642c-421f-b8b9-8661393d3ac1-apiservice-cert\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49jhs\" (UniqueName: \"kubernetes.io/projected/81f45a90-202d-4042-96d3-1b24683fc0b6-kube-api-access-49jhs\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823903 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-metrics-tls\") pod \"dns-default-sqjc8\" (UID: \"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823922 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ktz\" (UniqueName: 
\"kubernetes.io/projected/22fc92d5-a686-419c-ae28-a87874c0f06f-kube-api-access-p4ktz\") pod \"service-ca-9c57cc56f-gsfgd\" (UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823938 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34ce1090-925c-45cc-b797-a08ddbe3dd98-config-volume\") pod \"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6qq\" (UniqueName: \"kubernetes.io/projected/6850b102-3de9-4180-9897-79a45817954a-kube-api-access-zv6qq\") pod \"ingress-canary-q5npm\" (UID: \"6850b102-3de9-4180-9897-79a45817954a\") " pod="openshift-ingress-canary/ingress-canary-q5npm" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-config\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823984 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.823999 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mnpv\" (UniqueName: \"kubernetes.io/projected/440365f2-877d-49bd-89c3-0dc4ad54efaa-kube-api-access-2mnpv\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824021 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824036 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05b7a445-e2e0-4bed-bd90-7015ecfc4645-auth-proxy-config\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824053 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40081104-3347-4d8a-bfe9-04c6f86948be-audit-dir\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc 
kubenswrapper[4763]: I1201 09:17:00.824070 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c54cf3d9-0427-43dc-816c-a56ce2c56c83-proxy-tls\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824084 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-stats-auth\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824099 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0d624135-9b60-436b-a1a5-02d6028880ae-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824124 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/40081104-3347-4d8a-bfe9-04c6f86948be-node-pullsecrets\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824140 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95rdt\" (UniqueName: \"kubernetes.io/projected/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-kube-api-access-95rdt\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824156 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njkgv\" (UniqueName: \"kubernetes.io/projected/a48c156a-fbc7-481c-96cc-201992569c1e-kube-api-access-njkgv\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824171 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-mountpoint-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-config\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: \"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824206 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-images\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824220 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47303b2b-8915-4828-933b-52f4804bd423-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824257 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2cll\" (UniqueName: \"kubernetes.io/projected/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-kube-api-access-v2cll\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: \"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824287 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9hww\" (UniqueName: \"kubernetes.io/projected/f6366d07-d82c-4e35-9c94-946426119bde-kube-api-access-h9hww\") pod \"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824307 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ead8d5dc-6a02-4fd1-8c68-c137fd26bda9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sjmfr\" (UID: \"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824323 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs489\" (UniqueName: \"kubernetes.io/projected/ead8d5dc-6a02-4fd1-8c68-c137fd26bda9-kube-api-access-qs489\") pod \"multus-admission-controller-857f4d67dd-sjmfr\" (UID: \"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824340 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-metrics-tls\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824354 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f6366d07-d82c-4e35-9c94-946426119bde-srv-cert\") pod \"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824369 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a77fb4c4-4741-4f18-aae2-3aefb20448d0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824385 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-config\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824400 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj454\" (UniqueName: \"kubernetes.io/projected/be60db96-5a06-453d-b95e-4637aa61e1f1-kube-api-access-mj454\") pod \"package-server-manager-789f6589d5-t9rl9\" (UID: \"be60db96-5a06-453d-b95e-4637aa61e1f1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824416 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/92de651e-81ec-432a-a942-91b959acb4d2-certs\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824430 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6366d07-d82c-4e35-9c94-946426119bde-profile-collector-cert\") pod \"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-config-volume\") pod \"dns-default-sqjc8\" (UID: \"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824474 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbw2d\" (UniqueName: \"kubernetes.io/projected/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-kube-api-access-pbw2d\") pod \"dns-default-sqjc8\" (UID: \"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824491 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48c156a-fbc7-481c-96cc-201992569c1e-serving-cert\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-metrics-certs\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 
01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824526 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxx6\" (UniqueName: \"kubernetes.io/projected/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-kube-api-access-9cxx6\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.824868 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-etcd-serving-ca\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br"
Dec 01 09:17:00 crc kubenswrapper[4763]: E1201 09:17:00.825640 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.325613559 +0000 UTC m=+138.594262367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.825931 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47303b2b-8915-4828-933b-52f4804bd423-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.830504 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b7a445-e2e0-4bed-bd90-7015ecfc4645-config\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.835315 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a77fb4c4-4741-4f18-aae2-3aefb20448d0-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.835478 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-trusted-ca\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c"
Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.835743 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-csi-data-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9"
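
The UnmountVolume.TearDown failure above has the same root cause as the earlier MountVolume.MountDevice failure: tearing down the same PVC for a previous pod instance (UID 8f668bae-612b-4b75-9490-919e737c6a3b) also needs a CSI client for kubevirt.io.hostpath-provisioner, which is still unregistered. Both operations are throttled per volume by nestedpendingoperations.go. The 500ms initial delay is visible in both "No retries permitted until" messages; the doubling and the roughly two-minute cap in the toy sketch below are assumptions based on the upstream kubelet defaults, not values taken from this log:

```go
package main

import (
	"fmt"
	"time"
)

const (
	initialDelay = 500 * time.Millisecond        // matches "durationBeforeRetry 500ms" in the log
	maxDelay     = 2*time.Minute + 2*time.Second // assumed cap, from upstream kubelet defaults
)

// backoff is a toy version of the per-operation exponential backoff the
// kubelet applies to failed volume operations.
type backoff struct {
	lastError time.Time
	delay     time.Duration
}

// update records a failure and grows the delay before the next retry.
func (b *backoff) update(now time.Time) {
	if b.delay == 0 {
		b.delay = initialDelay
	} else {
		b.delay *= 2
		if b.delay > maxDelay {
			b.delay = maxDelay
		}
	}
	b.lastError = now
}

// safeToRetry reports whether enough time has passed since the last failure,
// phrased like the kubelet's "No retries permitted until ..." message.
func (b *backoff) safeToRetry(now time.Time) error {
	if deadline := b.lastError.Add(b.delay); now.Before(deadline) {
		return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %s)", deadline, b.delay)
	}
	return nil
}

func main() {
	var b backoff
	t0 := time.Date(2025, 12, 1, 9, 17, 0, 825640000, time.UTC) // failure time from the log
	b.update(t0)
	fmt.Println(b.safeToRetry(t0))                             // still throttled
	fmt.Println(b.safeToRetry(t0.Add(600 * time.Millisecond))) // past the 500ms window: <nil>
}
```

On a successful retry the kubelet resets the backoff for that volume, which is why these errors disappear from the log once the driver registers.
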
\"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.836219 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301fa88d-642c-421f-b8b9-8661393d3ac1-tmpfs\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.839702 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c54cf3d9-0427-43dc-816c-a56ce2c56c83-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.839827 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-config\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.840222 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/22fc92d5-a686-419c-ae28-a87874c0f06f-signing-cabundle\") pod \"service-ca-9c57cc56f-gsfgd\" (UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.856713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-metrics-tls\") pod \"dns-default-sqjc8\" (UID: \"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.856982 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34ce1090-925c-45cc-b797-a08ddbe3dd98-secret-volume\") pod \"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.868542 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-socket-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.868626 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-plugins-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.869631 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-client-ca\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: 
\"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.870101 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-service-ca\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.871670 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-serving-cert\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.871950 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-encryption-config\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.874116 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.879436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/92de651e-81ec-432a-a942-91b959acb4d2-node-bootstrap-token\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.881106 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-audit\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.881708 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: \"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.886791 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-config\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.889520 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e605f4b-743f-42b2-a437-64983f66992b-serving-cert\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.900300 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.901142 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t4nd\" (UniqueName: \"kubernetes.io/projected/0d624135-9b60-436b-a1a5-02d6028880ae-kube-api-access-2t4nd\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.901573 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/40081104-3347-4d8a-bfe9-04c6f86948be-etcd-client\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.902084 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/be60db96-5a06-453d-b95e-4637aa61e1f1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-t9rl9\" (UID: \"be60db96-5a06-453d-b95e-4637aa61e1f1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.904682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-default-certificate\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.905038 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/05b7a445-e2e0-4bed-bd90-7015ecfc4645-machine-approver-tls\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.905571 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f6329e56-18d1-4479-8699-897fdfdc60fb-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-l2f2g\" (UID: \"f6329e56-18d1-4479-8699-897fdfdc60fb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.905893 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-client\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.906250 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c78ac8a1-6273-4efe-9352-82d5aee9e048-serving-cert\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.912846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtj8f\" (UniqueName: \"kubernetes.io/projected/34ce1090-925c-45cc-b797-a08ddbe3dd98-kube-api-access-xtj8f\") pod \"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.913395 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c748802d-9eb3-4f13-80d8-9101979e400e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5s26f\" (UID: \"c748802d-9eb3-4f13-80d8-9101979e400e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.913725 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c78ac8a1-6273-4efe-9352-82d5aee9e048-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.926969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.928107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34ce1090-925c-45cc-b797-a08ddbe3dd98-config-volume\") pod \"collect-profiles-29409675-t54dv\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:00 crc kubenswrapper[4763]: E1201 09:17:00.931540 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.43152734 +0000 UTC m=+138.700176108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.932033 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-config\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.932624 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.933832 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.934278 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05b7a445-e2e0-4bed-bd90-7015ecfc4645-auth-proxy-config\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.934317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40081104-3347-4d8a-bfe9-04c6f86948be-audit-dir\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.934328 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a77fb4c4-4741-4f18-aae2-3aefb20448d0-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.937903 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/47303b2b-8915-4828-933b-52f4804bd423-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.937988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.938483 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/40081104-3347-4d8a-bfe9-04c6f86948be-image-import-ca\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.940618 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0d624135-9b60-436b-a1a5-02d6028880ae-srv-cert\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.941374 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-config\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: \"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.944145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/40081104-3347-4d8a-bfe9-04c6f86948be-node-pullsecrets\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.944794 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-mountpoint-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.945247 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-registration-dir\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.946227 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: \"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.947407 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-metrics-tls\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.957758 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-proxy-tls\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.967841 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: \"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.968303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/301fa88d-642c-421f-b8b9-8661393d3ac1-webhook-cert\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.969515 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffgx2\" (UniqueName: \"kubernetes.io/projected/5e605f4b-743f-42b2-a437-64983f66992b-kube-api-access-ffgx2\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.970167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0d624135-9b60-436b-a1a5-02d6028880ae-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k99ch\" (UID: \"0d624135-9b60-436b-a1a5-02d6028880ae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.976479 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-images\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.976681 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a824d314-6a56-4109-83d7-031171aeb8e6-etcd-ca\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.977247 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wsqq\" (UniqueName: \"kubernetes.io/projected/301fa88d-642c-421f-b8b9-8661393d3ac1-kube-api-access-9wsqq\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.980078 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwh8r\" (UniqueName: 
\"kubernetes.io/projected/fbf5149a-5e97-445e-b53d-fc3ef1a0f66b-kube-api-access-cwh8r\") pod \"machine-config-operator-74547568cd-5m58m\" (UID: \"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.985344 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/301fa88d-642c-421f-b8b9-8661393d3ac1-apiservice-cert\") pod \"packageserver-d55dfcdfc-265xp\" (UID: \"301fa88d-642c-421f-b8b9-8661393d3ac1\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.988195 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.989904 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c54cf3d9-0427-43dc-816c-a56ce2c56c83-proxy-tls\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.990369 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a77fb4c4-4741-4f18-aae2-3aefb20448d0-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-g8754\" (UID: \"a77fb4c4-4741-4f18-aae2-3aefb20448d0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.970498 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.993386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-config-volume\") pod \"dns-default-sqjc8\" (UID: \"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.996759 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a824d314-6a56-4109-83d7-031171aeb8e6-serving-cert\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.997149 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-stats-auth\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:00 crc kubenswrapper[4763]: I1201 09:17:00.999663 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f6366d07-d82c-4e35-9c94-946426119bde-srv-cert\") pod 
\"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.000129 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7bml\" (UniqueName: \"kubernetes.io/projected/c78ac8a1-6273-4efe-9352-82d5aee9e048-kube-api-access-r7bml\") pod \"openshift-config-operator-7777fb866f-bmp8l\" (UID: \"c78ac8a1-6273-4efe-9352-82d5aee9e048\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.000547 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ead8d5dc-6a02-4fd1-8c68-c137fd26bda9-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sjmfr\" (UID: \"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.004928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxx6\" (UniqueName: \"kubernetes.io/projected/a9377ab4-04fa-4a1a-afd3-6ab93a78987e-kube-api-access-9cxx6\") pod \"csi-hostpathplugin-ft6z9\" (UID: \"a9377ab4-04fa-4a1a-afd3-6ab93a78987e\") " pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.005329 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a48c156a-fbc7-481c-96cc-201992569c1e-serving-cert\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.016676 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wxd7\" (UniqueName: \"kubernetes.io/projected/47303b2b-8915-4828-933b-52f4804bd423-kube-api-access-8wxd7\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.018179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/22fc92d5-a686-419c-ae28-a87874c0f06f-signing-key\") pod \"service-ca-9c57cc56f-gsfgd\" (UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.022539 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.025951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c79cs\" (UniqueName: \"kubernetes.io/projected/05b7a445-e2e0-4bed-bd90-7015ecfc4645-kube-api-access-c79cs\") pod \"machine-approver-56656f9798-xsvz6\" (UID: \"05b7a445-e2e0-4bed-bd90-7015ecfc4645\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.030812 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.031311 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.031606 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.531591343 +0000 UTC m=+138.800240101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.059053 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81f45a90-202d-4042-96d3-1b24683fc0b6-metrics-certs\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.059717 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khg4r\" (UniqueName: \"kubernetes.io/projected/c54cf3d9-0427-43dc-816c-a56ce2c56c83-kube-api-access-khg4r\") pod \"machine-config-controller-84d6567774-zkwbx\" (UID: \"c54cf3d9-0427-43dc-816c-a56ce2c56c83\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.060311 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49jhs\" (UniqueName: \"kubernetes.io/projected/81f45a90-202d-4042-96d3-1b24683fc0b6-kube-api-access-49jhs\") pod \"router-default-5444994796-z5zrr\" (UID: \"81f45a90-202d-4042-96d3-1b24683fc0b6\") " pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.061808 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbw2d\" (UniqueName: \"kubernetes.io/projected/7f3ab311-db3a-4850-a1dc-bd3af2e3f9da-kube-api-access-pbw2d\") pod \"dns-default-sqjc8\" (UID: 
\"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da\") " pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.063300 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f6366d07-d82c-4e35-9c94-946426119bde-profile-collector-cert\") pod \"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.064686 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.065738 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdkb\" (UniqueName: \"kubernetes.io/projected/40081104-3347-4d8a-bfe9-04c6f86948be-kube-api-access-4mdkb\") pod \"apiserver-76f77b778f-wf2br\" (UID: \"40081104-3347-4d8a-bfe9-04c6f86948be\") " pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.066209 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbsjg\" (UniqueName: \"kubernetes.io/projected/dc8d6026-9d55-4b47-8a46-00480cd75fe0-kube-api-access-sbsjg\") pod \"migrator-59844c95c7-6c2mw\" (UID: \"dc8d6026-9d55-4b47-8a46-00480cd75fe0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.066438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95rdt\" (UniqueName: \"kubernetes.io/projected/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-kube-api-access-95rdt\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.067136 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6850b102-3de9-4180-9897-79a45817954a-cert\") pod \"ingress-canary-q5npm\" (UID: \"6850b102-3de9-4180-9897-79a45817954a\") " pod="openshift-ingress-canary/ingress-canary-q5npm" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.070180 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/92de651e-81ec-432a-a942-91b959acb4d2-certs\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.070196 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nql8z\" (UniqueName: \"kubernetes.io/projected/c748802d-9eb3-4f13-80d8-9101979e400e-kube-api-access-nql8z\") pod \"cluster-samples-operator-665b6dd947-5s26f\" (UID: \"c748802d-9eb3-4f13-80d8-9101979e400e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.071363 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67dab1e-ade9-4a36-8e71-8d6fc206d0b5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6mbc\" (UID: \"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:01 
crc kubenswrapper[4763]: I1201 09:17:01.072818 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/09f0c6f8-b1c4-4086-9073-2a77ce3a6191-bound-sa-token\") pod \"ingress-operator-5b745b69d9-6v56c\" (UID: \"09f0c6f8-b1c4-4086-9073-2a77ce3a6191\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.077755 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6qq\" (UniqueName: \"kubernetes.io/projected/6850b102-3de9-4180-9897-79a45817954a-kube-api-access-zv6qq\") pod \"ingress-canary-q5npm\" (UID: \"6850b102-3de9-4180-9897-79a45817954a\") " pod="openshift-ingress-canary/ingress-canary-q5npm" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.078079 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-config\") pod \"route-controller-manager-6576b87f9c-s7nl7\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.079253 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2cll\" (UniqueName: \"kubernetes.io/projected/cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6-kube-api-access-v2cll\") pod \"kube-storage-version-migrator-operator-b67b599dd-zg7dc\" (UID: \"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.083388 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gwr6q\" (UID: \"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.083736 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.084443 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm4fx\" (UniqueName: \"kubernetes.io/projected/f6329e56-18d1-4479-8699-897fdfdc60fb-kube-api-access-tm4fx\") pod \"control-plane-machine-set-operator-78cbb6b69f-l2f2g\" (UID: \"f6329e56-18d1-4479-8699-897fdfdc60fb\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.087716 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mnpv\" (UniqueName: \"kubernetes.io/projected/440365f2-877d-49bd-89c3-0dc4ad54efaa-kube-api-access-2mnpv\") pod \"marketplace-operator-79b997595-4pz4m\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") " pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.090235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4ktz\" (UniqueName: \"kubernetes.io/projected/22fc92d5-a686-419c-ae28-a87874c0f06f-kube-api-access-p4ktz\") pod \"service-ca-9c57cc56f-gsfgd\" (UID: \"22fc92d5-a686-419c-ae28-a87874c0f06f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.092767 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs489\" (UniqueName: \"kubernetes.io/projected/ead8d5dc-6a02-4fd1-8c68-c137fd26bda9-kube-api-access-qs489\") pod \"multus-admission-controller-857f4d67dd-sjmfr\" (UID: \"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.095842 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9hww\" (UniqueName: \"kubernetes.io/projected/f6366d07-d82c-4e35-9c94-946426119bde-kube-api-access-h9hww\") pod \"catalog-operator-68c6474976-ncq7s\" (UID: \"f6366d07-d82c-4e35-9c94-946426119bde\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.100280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6wwj\" (UniqueName: \"kubernetes.io/projected/a824d314-6a56-4109-83d7-031171aeb8e6-kube-api-access-d6wwj\") pod \"etcd-operator-b45778765-tq979\" (UID: \"a824d314-6a56-4109-83d7-031171aeb8e6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.100827 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njkgv\" (UniqueName: \"kubernetes.io/projected/a48c156a-fbc7-481c-96cc-201992569c1e-kube-api-access-njkgv\") pod \"service-ca-operator-777779d784-blfn6\" (UID: \"a48c156a-fbc7-481c-96cc-201992569c1e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.100977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd55w\" (UniqueName: \"kubernetes.io/projected/92de651e-81ec-432a-a942-91b959acb4d2-kube-api-access-gd55w\") pod \"machine-config-server-pqkfc\" (UID: \"92de651e-81ec-432a-a942-91b959acb4d2\") " pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.101551 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj454\" (UniqueName: \"kubernetes.io/projected/be60db96-5a06-453d-b95e-4637aa61e1f1-kube-api-access-mj454\") pod \"package-server-manager-789f6589d5-t9rl9\" (UID: \"be60db96-5a06-453d-b95e-4637aa61e1f1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.104917 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47303b2b-8915-4828-933b-52f4804bd423-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-76x97\" (UID: \"47303b2b-8915-4828-933b-52f4804bd423\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.106288 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.116322 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.126344 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.127429 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.133275 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.133842 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.134183 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.634150114 +0000 UTC m=+138.902798882 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.148722 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.162272 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.163835 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.178193 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.186083 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.196780 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.205072 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.214073 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.222851 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.232834 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.235125 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.235555 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.735540142 +0000 UTC m=+139.004188910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.242324 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.249160 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.258712 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.265659 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.273559 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.292094 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.306141 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.321099 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.331493 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pqkfc" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.337086 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.338536 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.838508783 +0000 UTC m=+139.107157601 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.365466 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q5npm" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.444439 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.444662 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.94463602 +0000 UTC m=+139.213284788 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.444839 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.445243 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:01.945231007 +0000 UTC m=+139.213879775 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.545634 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.545973 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.045958608 +0000 UTC m=+139.314607376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.549409 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qdpq8"] Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.563421 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-458q5"] Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.634767 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s"] Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.654236 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.654821 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.154811028 +0000 UTC m=+139.423459796 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.758717 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxfdf"] Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.769510 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.769614 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.26959734 +0000 UTC m=+139.538246108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.769885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.770268 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.270258469 +0000 UTC m=+139.538907237 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.774276 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lk65c"] Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.821004 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2"] Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.865598 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" event={"ID":"2307c38a-2af7-4b03-b99a-e5ca5bed76a8","Type":"ContainerStarted","Data":"eaafcfdbe03d2f18c7e826eaf8ed600a988c6ac0410c2f9e8a820f59c009973d"} Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.875429 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.876039 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.376020005 +0000 UTC m=+139.644668773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.877055 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" event={"ID":"05b7a445-e2e0-4bed-bd90-7015ecfc4645","Type":"ContainerStarted","Data":"47b9b680fa37788ca07bfa771a6ec4e5cc8bb571ac31139fcccc26d53cded833"} Dec 01 09:17:01 crc kubenswrapper[4763]: W1201 09:17:01.883654 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36509e1a_f77b_4e69_85bf_e018b27205d2.slice/crio-9082ed506f5d2b849f4edfb8c8845ca5a8e353bf2f8984ea392610b90becaa8f WatchSource:0}: Error finding container 9082ed506f5d2b849f4edfb8c8845ca5a8e353bf2f8984ea392610b90becaa8f: Status 404 returned error can't find the container with id 9082ed506f5d2b849f4edfb8c8845ca5a8e353bf2f8984ea392610b90becaa8f Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.904139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-548t8" event={"ID":"c7635c69-b75b-4ecf-89f8-ffb47bf995c5","Type":"ContainerStarted","Data":"c33ee8e6f4b3e6d62df65e92de73aec382670a9f8f71510c60ddbcd222a1155b"} Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.910229 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" event={"ID":"ae9504b2-987f-4ea3-bed5-7a4b2ea10178","Type":"ContainerStarted","Data":"8918ea7c23a311a19843ec5160fc285096c11da6b3aa2e75a5696a3d5557bdc8"} Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.913727 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.937888 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-l4kcj" podStartSLOduration=119.937870882 podStartE2EDuration="1m59.937870882s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:01.901024413 +0000 UTC m=+139.169673181" watchObservedRunningTime="2025-12-01 09:17:01.937870882 +0000 UTC m=+139.206519650" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.946093 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-548t8" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.948628 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-548t8" podStartSLOduration=120.948614773 podStartE2EDuration="2m0.948614773s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:01.940713989 +0000 UTC m=+139.209362777" 
watchObservedRunningTime="2025-12-01 09:17:01.948614773 +0000 UTC m=+139.217263541" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.952608 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" event={"ID":"c683385a-b0c6-449e-bc11-b24c3824cb7d","Type":"ContainerStarted","Data":"b17e42f08cec6dbffca358856d5536e57493f23de73e397a3b22ce8329b7f719"} Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.973140 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-56vb4" podStartSLOduration=120.973123148 podStartE2EDuration="2m0.973123148s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:01.964597337 +0000 UTC m=+139.233246105" watchObservedRunningTime="2025-12-01 09:17:01.973123148 +0000 UTC m=+139.241771916" Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.977608 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:01 crc kubenswrapper[4763]: E1201 09:17:01.977948 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.477934559 +0000 UTC m=+139.746583327 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:01 crc kubenswrapper[4763]: I1201 09:17:01.988144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" event={"ID":"9bb7a73a-5d2e-4134-a0f1-d04e06492022","Type":"ContainerStarted","Data":"2aa99fc67cbeec43f7fedeee12850e413fb34972f01631836816ca86165abf71"} Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.082343 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.083867 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.58384794 +0000 UTC m=+139.852496708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.084852 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9plk4" podStartSLOduration=121.084839047 podStartE2EDuration="2m1.084839047s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:02.075820082 +0000 UTC m=+139.344468850" watchObservedRunningTime="2025-12-01 09:17:02.084839047 +0000 UTC m=+139.353487815" Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.152611 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.166652 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.170373 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sqjc8"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.184571 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.185111 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.685095825 +0000 UTC m=+139.953744593 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.247809 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ft6z9"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.260136 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.285691 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.286066 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.786050092 +0000 UTC m=+140.054698860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.291611 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tq979"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.333723 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.387154 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.387569 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.887544293 +0000 UTC m=+140.156193161 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.488879 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.489384 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:02.989370003 +0000 UTC m=+140.258018771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.590901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.591240 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.091227075 +0000 UTC m=+140.359875843 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.691637 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.691846 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.191816322 +0000 UTC m=+140.460465090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.692052 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.692441 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.192429308 +0000 UTC m=+140.461078076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.716627 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wf2br"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.733610 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.762027 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f"] Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.794131 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.794504 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.294486735 +0000 UTC m=+140.563135503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.898255 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.898593 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.398578466 +0000 UTC m=+140.667227234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.999156 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:02 crc kubenswrapper[4763]: E1201 09:17:02.999497 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.499438831 +0000 UTC m=+140.768087649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:02 crc kubenswrapper[4763]: I1201 09:17:02.999676 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.000096 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.500085629 +0000 UTC m=+140.768734417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.099287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sqjc8" event={"ID":"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da","Type":"ContainerStarted","Data":"92d79e170ba0df711d73a60ae6bfed3812212118dd2f50f2284606f01528c9c9"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.106258 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.106718 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.606700359 +0000 UTC m=+140.875349127 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.110104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" event={"ID":"36509e1a-f77b-4e69-85bf-e018b27205d2","Type":"ContainerStarted","Data":"b4c71ae3105a3876c6129b152e43ef9f04513ba052575a3e5b4edbdb177fdd96"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.110155 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" event={"ID":"36509e1a-f77b-4e69-85bf-e018b27205d2","Type":"ContainerStarted","Data":"9082ed506f5d2b849f4edfb8c8845ca5a8e353bf2f8984ea392610b90becaa8f"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.113237 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" event={"ID":"08a32dd3-b775-4153-a505-99b17e1637b1","Type":"ContainerStarted","Data":"40b3107e6a9004e097cf8bcb0fae0068f2df5a4132af76ab6e26f761b919ae07"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.135043 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.190586 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sjmfr"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.192306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" event={"ID":"34ce1090-925c-45cc-b797-a08ddbe3dd98","Type":"ContainerStarted","Data":"603ddd5a449afbd2e8419a37fa23bb1544ca30797a7fbb86e9baefeb5c5b3ef3"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.211377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.219031 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.718994404 +0000 UTC m=+140.987643172 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.264715 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.269934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-458q5" event={"ID":"f0ccd14f-5d77-4541-860f-d834079cf97f","Type":"ContainerStarted","Data":"2eab0facfcd9cd78e1730283d2893dfdb96618082a7fd0468ec6aebf59b5b5ec"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.269972 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-458q5" event={"ID":"f0ccd14f-5d77-4541-860f-d834079cf97f","Type":"ContainerStarted","Data":"061f789e392c10e7cac7f92148a23e57d121bb2bffa97a7cddf22188524862f3"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.272296 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.276597 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.283610 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.289680 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.320893 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:03 crc 
kubenswrapper[4763]: E1201 09:17:03.321924 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.821907323 +0000 UTC m=+141.090556091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.336873 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.343695 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" event={"ID":"a824d314-6a56-4109-83d7-031171aeb8e6","Type":"ContainerStarted","Data":"63022e0ec4e31bcd4576f2176e8b0282e20598f304f339625c4fd745003f3c6e"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.372273 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4pz4m"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.373552 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.378595 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.387674 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q5npm"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.394311 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.395586 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-blfn6"] Dec 01 09:17:03 crc kubenswrapper[4763]: W1201 09:17:03.397234 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead8d5dc_6a02_4fd1_8c68_c137fd26bda9.slice/crio-984715cb4881a33dd479c659b65074784c2e202b6d955981206e2621265f5b33 WatchSource:0}: Error finding container 984715cb4881a33dd479c659b65074784c2e202b6d955981206e2621265f5b33: Status 404 returned error can't find the container with id 984715cb4881a33dd479c659b65074784c2e202b6d955981206e2621265f5b33 Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.408498 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.410366 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.415521 
4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gsfgd"] Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.422487 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.422887 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:03.922871761 +0000 UTC m=+141.191520529 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.439682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" event={"ID":"301fa88d-642c-421f-b8b9-8661393d3ac1","Type":"ContainerStarted","Data":"971bae346ed171d9099b9c803bf01429dac148d31c703f123c976079c852be95"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.459665 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" event={"ID":"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b","Type":"ContainerStarted","Data":"54af5d7323122cc3c278c011c9b3394f84744be5691f51e6a19afaf856be4b89"} Dec 01 09:17:03 crc kubenswrapper[4763]: W1201 09:17:03.474114 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc8d6026_9d55_4b47_8a46_00480cd75fe0.slice/crio-d54c9b3d7e2337ae0321ab77b9e4c21c919870a253c4c3fd4dd9ef9932c1d34b WatchSource:0}: Error finding container d54c9b3d7e2337ae0321ab77b9e4c21c919870a253c4c3fd4dd9ef9932c1d34b: Status 404 returned error can't find the container with id d54c9b3d7e2337ae0321ab77b9e4c21c919870a253c4c3fd4dd9ef9932c1d34b Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.497511 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pqkfc" event={"ID":"92de651e-81ec-432a-a942-91b959acb4d2","Type":"ContainerStarted","Data":"c9282fb4445662864a3022b656cb2a079077e4494220c91d4a9e3dae6ee7f94a"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.524224 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.525105 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.025088241 +0000 UTC m=+141.293737009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.586146 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" event={"ID":"53a538a2-4a8b-4524-aca4-5eff4f91cce5","Type":"ContainerStarted","Data":"400753ee1b506abab9d94eefac68fcd8d0662931accc7f6e528d95f712804ddc"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.591272 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" event={"ID":"5e605f4b-743f-42b2-a437-64983f66992b","Type":"ContainerStarted","Data":"2dfb3f6e37e8453957e6d73555c517d1e897cb0c30b203a463ad0286a43eb86e"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.592994 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.600755 4763 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-s7nl7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.600804 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" podUID="5e605f4b-743f-42b2-a437-64983f66992b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.620064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lk65c" event={"ID":"f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4","Type":"ContainerStarted","Data":"139d02e810e697d2d98b3828cf50c9be7e433d54e74c2a9770281e2c907175a5"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.620110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lk65c" event={"ID":"f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4","Type":"ContainerStarted","Data":"6aa33b8754da537274c5ac9d2f6a19f576a0dfc5405942c63dbfd056a9ddd1b6"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.621316 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lk65c" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.622281 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-lk65c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection 
refused" start-of-body= Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.622320 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lk65c" podUID="f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.631214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.631491 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.131479736 +0000 UTC m=+141.400128504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.673293 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" event={"ID":"0d624135-9b60-436b-a1a5-02d6028880ae","Type":"ContainerStarted","Data":"5077783ba106f359bbfe60b0aea95d6168f719526be00de8fb74b3b4563a164a"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.674076 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.678762 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-z5zrr" event={"ID":"81f45a90-202d-4042-96d3-1b24683fc0b6","Type":"ContainerStarted","Data":"50f69cd24d844fda4e47738a2d910265befc99eb5272efc786158626137a34fe"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.678805 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-z5zrr" event={"ID":"81f45a90-202d-4042-96d3-1b24683fc0b6","Type":"ContainerStarted","Data":"46bb26b7d39a078d37cf3fc69dc6daa76fd7493edf99cc9a9835ee07e3d70015"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.686123 4763 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k99ch container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.686170 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" podUID="0d624135-9b60-436b-a1a5-02d6028880ae" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.699253 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" event={"ID":"40081104-3347-4d8a-bfe9-04c6f86948be","Type":"ContainerStarted","Data":"6cf34b4d9ae4fac06bb686fa0f2e928b1cfbe2d6b8869ac260ac99ffa269d4ff"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.708168 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-458q5" podStartSLOduration=122.708150444 podStartE2EDuration="2m2.708150444s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:03.70686979 +0000 UTC m=+140.975518558" watchObservedRunningTime="2025-12-01 09:17:03.708150444 +0000 UTC m=+140.976799212" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.709679 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pjz7s" podStartSLOduration=122.709671416 podStartE2EDuration="2m2.709671416s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:03.669579109 +0000 UTC m=+140.938227877" watchObservedRunningTime="2025-12-01 09:17:03.709671416 +0000 UTC m=+140.978320184" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.731791 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.732921 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.232906535 +0000 UTC m=+141.501555303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.773628 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" podStartSLOduration=121.773614369 podStartE2EDuration="2m1.773614369s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:03.772465827 +0000 UTC m=+141.041114595" watchObservedRunningTime="2025-12-01 09:17:03.773614369 +0000 UTC m=+141.042263137" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.823130 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-z5zrr" podStartSLOduration=122.823115371 podStartE2EDuration="2m2.823115371s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:03.822085653 +0000 UTC m=+141.090734431" watchObservedRunningTime="2025-12-01 09:17:03.823115371 +0000 UTC m=+141.091764139" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.833571 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.833938 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.333926844 +0000 UTC m=+141.602575612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.838784 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" event={"ID":"05b7a445-e2e0-4bed-bd90-7015ecfc4645","Type":"ContainerStarted","Data":"f76a43048a1415966d882a9a0fecf076597cf429066aeb8a0e3e864dfa1bc1b4"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.858085 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lk65c" podStartSLOduration=122.858068879 podStartE2EDuration="2m2.858068879s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:03.854344488 +0000 UTC m=+141.122993256" watchObservedRunningTime="2025-12-01 09:17:03.858068879 +0000 UTC m=+141.126717647" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.865256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" event={"ID":"a9377ab4-04fa-4a1a-afd3-6ab93a78987e","Type":"ContainerStarted","Data":"c047dc4b3ca46b5ff9137421acbc65d546e2ad660d2c7ae8e2e368846de35690"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.890703 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" podStartSLOduration=121.890685113 podStartE2EDuration="2m1.890685113s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:03.889758307 +0000 UTC m=+141.158407075" watchObservedRunningTime="2025-12-01 09:17:03.890685113 +0000 UTC m=+141.159333881" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.921485 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" event={"ID":"940c2b58-e113-4dc3-8717-6d6be27a033d","Type":"ContainerStarted","Data":"7ac6af7c882815e3b8611f5d2ef5e310e87561295f3e3f82722aacb37b0f2513"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.921529 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" event={"ID":"940c2b58-e113-4dc3-8717-6d6be27a033d","Type":"ContainerStarted","Data":"2f175d6b4d31143bd1e8dc72aa37087ea7e1e2ac372c9c505a5e50f7544e8635"} Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.931498 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.931654 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" 
podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.934876 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:03 crc kubenswrapper[4763]: E1201 09:17:03.935298 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.435280781 +0000 UTC m=+141.703929549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:03 crc kubenswrapper[4763]: I1201 09:17:03.965113 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" podStartSLOduration=122.96509715 podStartE2EDuration="2m2.96509715s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:03.96399675 +0000 UTC m=+141.232645508" watchObservedRunningTime="2025-12-01 09:17:03.96509715 +0000 UTC m=+141.233745918" Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.039137 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.041633 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.541620364 +0000 UTC m=+141.810269132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.144493 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.145089 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.645075899 +0000 UTC m=+141.913724667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.249144 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.249426 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.749416098 +0000 UTC m=+142.018064866 (durationBeforeRetry 500ms). 
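
The nestedpendingoperations lines show the gate that paces this loop: after each failure the operation is parked, and any attempt before the last failure time plus the backoff (a flat 500ms here, per durationBeforeRetry) is rejected with "No retries permitted until ...". A simplified model of that gate, not the real kubelet implementation:

    package main

    import (
        "fmt"
        "time"
    )

    // pendingOp holds the per-volume retry state suggested by the log:
    // the time of the last failure plus a minimum wait before retrying.
    type pendingOp struct {
        lastErrorTime time.Time
        backoff       time.Duration
    }

    func (p *pendingOp) tryStart(now time.Time) error {
        if retryAt := p.lastErrorTime.Add(p.backoff); now.Before(retryAt) {
            return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %v)",
                retryAt.Format(time.RFC3339Nano), p.backoff)
        }
        return nil
    }

    func main() {
        op := &pendingOp{lastErrorTime: time.Now(), backoff: 500 * time.Millisecond}
        if err := op.tryStart(time.Now()); err != nil {
            fmt.Println(err) // immediate retry rejected, as in the lines above
        }
        time.Sleep(600 * time.Millisecond)
        if err := op.tryStart(time.Now()); err == nil {
            fmt.Println("window elapsed; the reconciler may retry the operation")
        }
    }
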
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.249794 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.258645 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:04 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:04 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:04 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.258698 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.355253 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.355610 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.855593676 +0000 UTC m=+142.124242444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.456773 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.457101 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:04.957089778 +0000 UTC m=+142.225738546 (durationBeforeRetry 500ms). 
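
The router startup probe above fails with an HTTP 500 whose first body lines enumerate subchecks ([-]backend-http, [-]has-synced, [+]process-running). A hedged sketch of an HTTP probe of this shape, recording the status code plus only the start of the body, run against a stand-in healthz endpoint:

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "net/http/httptest"
    )

    // probe performs one HTTP check: a non-2xx/3xx status is a failure, and
    // only the start of the body is kept, mirroring "start-of-body=" above.
    func probe(url string) (ok bool, body string) {
        resp, err := http.Get(url)
        if err != nil {
            return false, err.Error() // e.g. connect: connection refused
        }
        defer resp.Body.Close()
        b, _ := io.ReadAll(io.LimitReader(resp.Body, 256))
        return resp.StatusCode >= 200 && resp.StatusCode < 400, string(b)
    }

    func main() {
        // Stand-in for the router's healthz while it has not synced yet.
        srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            w.WriteHeader(http.StatusInternalServerError)
            fmt.Fprintln(w, "[-]backend-http failed: reason withheld")
            fmt.Fprintln(w, "[-]has-synced failed: reason withheld")
            fmt.Fprintln(w, "[+]process-running ok")
            fmt.Fprintln(w, "healthz check failed")
        }))
        defer srv.Close()

        ok, body := probe(srv.URL)
        fmt.Printf("probe ok=%v start-of-body=%q\n", ok, body)
    }
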
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.558031 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.558280 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.058258621 +0000 UTC m=+142.326907389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.558628 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.558952 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.058935789 +0000 UTC m=+142.327584557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.660049 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.660416 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.160402459 +0000 UTC m=+142.429051227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.761075 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.761379 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.261360876 +0000 UTC m=+142.530009694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.862553 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.862953 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.362938941 +0000 UTC m=+142.631587709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.942735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" event={"ID":"47303b2b-8915-4828-933b-52f4804bd423","Type":"ContainerStarted","Data":"59a3ce158db67346814cb55454972416752cacdcaed048f4eebca0b7f810ee30"} Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.964377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:04 crc kubenswrapper[4763]: E1201 09:17:04.964703 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.464692119 +0000 UTC m=+142.733340887 (durationBeforeRetry 500ms). 
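
By this point the same two errors have recurred several times per second under fresh timestamps. When triaging a dump like this one, a small filter makes the loop visible at a glance; this sketch reads a saved journal excerpt on stdin and counts failures per operation and volume:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    func main() {
        re := regexp.MustCompile(`(MountVolume\.MountDevice|UnmountVolume\.TearDown) failed for volume "([^"]+)"`)
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
        for sc.Scan() {
            for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
                counts[m[1]+"  "+m[2]]++
            }
        }
        for k, n := range counts {
            fmt.Printf("%6d  %s\n", n, k)
        }
    }

Run with the journal excerpt on stdin; the two counters for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 should dominate the output.
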
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:04 crc kubenswrapper[4763]: I1201 09:17:04.972226 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" event={"ID":"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6","Type":"ContainerStarted","Data":"cb9f2867499e19c2a66b6af8f0589479580c99ebd211b384bb9cae0687a7db3e"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.069537 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.069719 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.569693915 +0000 UTC m=+142.838342683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.069814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.070122 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.570113837 +0000 UTC m=+142.838762685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.085547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" event={"ID":"c748802d-9eb3-4f13-80d8-9101979e400e","Type":"ContainerStarted","Data":"6be7ad0e4ee8c5ebb3d723efba6e535ef93224022fb050d47ccc1d0e9698d3b3"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.085584 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" event={"ID":"c748802d-9eb3-4f13-80d8-9101979e400e","Type":"ContainerStarted","Data":"e1bc3c56cb471fec67e7af17319d0c457e2a5cbe8d69f6554f07e5d153094902"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.086922 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" event={"ID":"a9377ab4-04fa-4a1a-afd3-6ab93a78987e","Type":"ContainerStarted","Data":"e6e1679976cfb898e058223ab92a94b0443855d0b063a3e98f96014388f4046f"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.170981 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.171425 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.671411123 +0000 UTC m=+142.940059891 (durationBeforeRetry 500ms). 
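
The ContainerStarted events for hostpath-provisioner/csi-hostpathplugin-ft6z9 above are the way out of this loop: that pod hosts the missing driver, and kubelet discovers node plugins through registration sockets under /var/lib/kubelet/plugins_registry. A hedged poller that reports when a matching socket appears (the directory is kubelet's standard plugin-watcher path; the exact socket name is an assumption, so the glob is loose):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "time"
    )

    func main() {
        // Registrar sockets are conventionally named <driver-name>-reg.sock.
        pattern := "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner*"
        for i := 0; i < 60; i++ {
            if matches, err := filepath.Glob(pattern); err == nil && len(matches) > 0 {
                fmt.Println("registration socket present:", matches[0])
                return
            }
            time.Sleep(time.Second)
        }
        fmt.Fprintln(os.Stderr, "driver did not register within 60s")
    }
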
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.172867 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" event={"ID":"22fc92d5-a686-419c-ae28-a87874c0f06f","Type":"ContainerStarted","Data":"c6bd99dc30ad4ea59b2a615b1438c7c64aec35fd1e46514e8401c8efa1ed9e9c"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.194078 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" event={"ID":"c683385a-b0c6-449e-bc11-b24c3824cb7d","Type":"ContainerStarted","Data":"473b1c57a56a2647ac128d22ec661d174cb22ee3f8231dc0b7a1c0441cdad075"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.197511 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" event={"ID":"a824d314-6a56-4109-83d7-031171aeb8e6","Type":"ContainerStarted","Data":"485d8ef47fd09895013a32581f7253930dfdfab9a09694af8df7a2a47646850e"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.199803 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" event={"ID":"5e605f4b-743f-42b2-a437-64983f66992b","Type":"ContainerStarted","Data":"0151da93d83d86563ff3b50513aed9e54ee8664c22c7c910a3e79035e35a5f5e"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.203832 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" event={"ID":"0d624135-9b60-436b-a1a5-02d6028880ae","Type":"ContainerStarted","Data":"0312d11765d9c2dd1160760ea8db04000f07cc67bed2dac6958b2c8e9fd16083"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.206443 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.207696 4763 generic.go:334] "Generic (PLEG): container finished" podID="53a538a2-4a8b-4524-aca4-5eff4f91cce5" containerID="080abe7d53741b6f15092108d5f9cf2d7f9036262efbd9cb68c113c7012531ab" exitCode=0 Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.207742 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" event={"ID":"53a538a2-4a8b-4524-aca4-5eff4f91cce5","Type":"ContainerDied","Data":"080abe7d53741b6f15092108d5f9cf2d7f9036262efbd9cb68c113c7012531ab"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.217841 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" event={"ID":"08a32dd3-b775-4153-a505-99b17e1637b1","Type":"ContainerStarted","Data":"5457caff952e4a23d58b648667e2f9f4917f7e55ee076208a69a333323e430f2"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.218714 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.219402 4763 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vcvq7" podStartSLOduration=124.219384794 podStartE2EDuration="2m4.219384794s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:05.216476415 +0000 UTC m=+142.485125183" watchObservedRunningTime="2025-12-01 09:17:05.219384794 +0000 UTC m=+142.488033562" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.226950 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" event={"ID":"09f0c6f8-b1c4-4086-9073-2a77ce3a6191","Type":"ContainerStarted","Data":"dba518ed5039327666f5bee64d48ee04ecfae72123aea81a6a03089e0c4dbd17"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.235584 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.237999 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pqkfc" event={"ID":"92de651e-81ec-432a-a942-91b959acb4d2","Type":"ContainerStarted","Data":"cef8d76d9ebd0a5d0e93979f4e98683961b1dadbb8b6f10b9198d24de77271bc"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.248169 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" event={"ID":"f6366d07-d82c-4e35-9c94-946426119bde","Type":"ContainerStarted","Data":"c02eac3d729f04aa870d388de0895617c2f9d961c2a95e0ee0fe58b71fde3e72"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.259774 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:05 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:05 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:05 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.259824 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.282728 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" event={"ID":"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9","Type":"ContainerStarted","Data":"984715cb4881a33dd479c659b65074784c2e202b6d955981206e2621265f5b33"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.284206 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.286852 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.786838612 +0000 UTC m=+143.055487470 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.298690 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" event={"ID":"f6329e56-18d1-4479-8699-897fdfdc60fb","Type":"ContainerStarted","Data":"362dcd6d50c9c7661d51d5f7d13d1c1c78ed028fbb0427f948d196929993252c"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.317317 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k99ch" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.365442 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tq979" podStartSLOduration=124.365426783 podStartE2EDuration="2m4.365426783s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:05.3214357 +0000 UTC m=+142.590084468" watchObservedRunningTime="2025-12-01 09:17:05.365426783 +0000 UTC m=+142.634075551" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.380297 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" event={"ID":"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b","Type":"ContainerStarted","Data":"dd5dad9c6721cba726aef7e2a19db1abc138a6f9b21fde3201336e1dd9ba989c"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.385949 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" event={"ID":"a48c156a-fbc7-481c-96cc-201992569c1e","Type":"ContainerStarted","Data":"9c1841a6706392b66384645c904dc1980e56447c7e1f1d87453a6521a562b9c9"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.392028 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.392403 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.892385224 +0000 UTC m=+143.161033992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.430028 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" event={"ID":"be60db96-5a06-453d-b95e-4637aa61e1f1","Type":"ContainerStarted","Data":"fb82434b8721cf9d5e9a5b0e07436f6e50d09f01b9ab6d8ff57e1549239357b5"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.451294 4763 generic.go:334] "Generic (PLEG): container finished" podID="40081104-3347-4d8a-bfe9-04c6f86948be" containerID="cbf74e6ad64ab7b131a9bb80fbcd46fc30c5f853721f47f8e207318526deba63" exitCode=0 Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.451377 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" event={"ID":"40081104-3347-4d8a-bfe9-04c6f86948be","Type":"ContainerDied","Data":"cbf74e6ad64ab7b131a9bb80fbcd46fc30c5f853721f47f8e207318526deba63"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.496353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.497199 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:05.997187075 +0000 UTC m=+143.265835843 (durationBeforeRetry 500ms). 
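
Interleaved with the volume churn, the SyncLoop (PLEG) lines report container lifecycle edges; the generic.go "container finished" entry above with exitCode=0, followed by a ContainerDied event for apiserver-76f77b778f-wf2br, is the normal completion of an init container. The event payload printed in the log has a small, regular shape; an illustrative model (field names follow the log text, not kubelet source):

    package main

    import "fmt"

    // plegEvent mirrors the event={"ID":...,"Type":...,"Data":...} payloads.
    type plegEvent struct {
        ID   string // pod UID
        Type string // ContainerStarted, ContainerDied, ...
        Data string // container (or sandbox) ID
    }

    func handle(e plegEvent) {
        switch e.Type {
        case "ContainerStarted":
            fmt.Printf("pod %s: container %s started\n", e.ID, e.Data)
        case "ContainerDied":
            fmt.Printf("pod %s: container %s exited\n", e.ID, e.Data)
        }
    }

    func main() {
        handle(plegEvent{
            ID:   "40081104-3347-4d8a-bfe9-04c6f86948be",
            Type: "ContainerDied",
            Data: "cbf74e6ad64ab7b131a9bb80fbcd46fc30c5f853721f47f8e207318526deba63",
        })
    }
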
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.498751 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pqkfc" podStartSLOduration=8.498734537 podStartE2EDuration="8.498734537s" podCreationTimestamp="2025-12-01 09:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:05.49777554 +0000 UTC m=+142.766424298" watchObservedRunningTime="2025-12-01 09:17:05.498734537 +0000 UTC m=+142.767383295" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.530107 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" event={"ID":"301fa88d-642c-421f-b8b9-8661393d3ac1","Type":"ContainerStarted","Data":"5ee846ab4fa49f0cfe86e9985617dcb6c5d65a97ec8109330b639a0687817064"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.531055 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.592903 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" event={"ID":"dc8d6026-9d55-4b47-8a46-00480cd75fe0","Type":"ContainerStarted","Data":"d54c9b3d7e2337ae0321ab77b9e4c21c919870a253c4c3fd4dd9ef9932c1d34b"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.601174 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.604026 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.10399992 +0000 UTC m=+143.372648688 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.605815 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" event={"ID":"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea","Type":"ContainerStarted","Data":"716aae42ce62d0bcfaff0e27f3086d9c361606f230a4a3e1e39bfda431786908"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.624638 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" podStartSLOduration=124.624619399 podStartE2EDuration="2m4.624619399s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:05.552274608 +0000 UTC m=+142.820923376" watchObservedRunningTime="2025-12-01 09:17:05.624619399 +0000 UTC m=+142.893268167" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.625674 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" event={"ID":"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5","Type":"ContainerStarted","Data":"6d50c33cdff75b0f68711466649c3c9aaa872d1845d6181e18232eaff1e8781f"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.642517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" event={"ID":"c78ac8a1-6273-4efe-9352-82d5aee9e048","Type":"ContainerStarted","Data":"23e322737e30cde7109757e88eaed603a5114689bc616020f8c066a52bfed575"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.642571 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" event={"ID":"c78ac8a1-6273-4efe-9352-82d5aee9e048","Type":"ContainerStarted","Data":"f8c0b171f4deae31f28f37406e563c5c306e9a1abbf0467416f2d2431d9009d1"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.658685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sqjc8" event={"ID":"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da","Type":"ContainerStarted","Data":"1884f01bf482056159a683221c1a3e025bc086047c20ff89a557431e78703590"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.678977 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" podStartSLOduration=123.678962043 podStartE2EDuration="2m3.678962043s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:05.676048954 +0000 UTC m=+142.944697712" watchObservedRunningTime="2025-12-01 09:17:05.678962043 +0000 UTC m=+142.947610811" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.691005 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q5npm" 
event={"ID":"6850b102-3de9-4180-9897-79a45817954a","Type":"ContainerStarted","Data":"261fc0b31beda791f8f777e4b144b4cfe2c7f17cc10d592fe5c5da1fc20f1704"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.703030 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" event={"ID":"a77fb4c4-4741-4f18-aae2-3aefb20448d0","Type":"ContainerStarted","Data":"0cf4df774035568c859ceb05d26191be4379fc0b79662863f3352f2f168120a7"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.703843 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.704095 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.204084674 +0000 UTC m=+143.472733442 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.753538 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" event={"ID":"34ce1090-925c-45cc-b797-a08ddbe3dd98","Type":"ContainerStarted","Data":"386b20a38dd799fc39d9e66afa072e549e8356a96ece9fe4dace2888470eb5d6"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.783319 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.799838 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" event={"ID":"c54cf3d9-0427-43dc-816c-a56ce2c56c83","Type":"ContainerStarted","Data":"dbfc72ed59e1b8b248055d6e8b974d5d290af4aa75d10b523cdaa0eaea8ab6cc"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.807081 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.807680 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.307665492 +0000 UTC m=+143.576314260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.860844 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" event={"ID":"05b7a445-e2e0-4bed-bd90-7015ecfc4645","Type":"ContainerStarted","Data":"244f3186b8c1b2a92de8b5ea418f345e90dea500b8af13e0ea2469d63c7657ef"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.884038 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" event={"ID":"440365f2-877d-49bd-89c3-0dc4ad54efaa","Type":"ContainerStarted","Data":"06611798f10fac70e29aac80c1cf7acb414141fb41c1a1cfa208b1e3ab1df859"} Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.884348 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-lk65c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.884393 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lk65c" podUID="f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.884430 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.911353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:05 crc kubenswrapper[4763]: E1201 09:17:05.918566 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.418550087 +0000 UTC m=+143.687198855 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.927959 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:17:05 crc kubenswrapper[4763]: I1201 09:17:05.970794 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" podStartSLOduration=124.970773113 podStartE2EDuration="2m4.970773113s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:05.81277219 +0000 UTC m=+143.081420958" watchObservedRunningTime="2025-12-01 09:17:05.970773113 +0000 UTC m=+143.239421881" Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.017949 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xsvz6" podStartSLOduration=125.017934322 podStartE2EDuration="2m5.017934322s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:06.017271064 +0000 UTC m=+143.285919832" watchObservedRunningTime="2025-12-01 09:17:06.017934322 +0000 UTC m=+143.286583090" Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.023194 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.023726 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.523701049 +0000 UTC m=+143.792349817 (durationBeforeRetry 500ms). 
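
The downloads-7954f5f757-lk65c readiness failure above is not an HTTP-level rejection but a socket-level one: nothing is listening yet, so the GET dies at dial time with "connect: connection refused", the same signature as the machine-config-daemon liveness failure earlier in this stretch. The distinction is visible from the error alone; a minimal check:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Port taken from the machine-config-daemon probe above; any closed
        // local port reproduces the same error class.
        conn, err := net.DialTimeout("tcp", "127.0.0.1:8798", time.Second)
        if err != nil {
            fmt.Println("probe failed:", err) // dial tcp ...: connect: connection refused
            return
        }
        conn.Close()
        fmt.Println("port open; an HTTP check would then decide health from the response")
    }
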
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.124288 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.124689 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.624675746 +0000 UTC m=+143.893324514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.225278 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.225741 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.725723495 +0000 UTC m=+143.994372263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.286043 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" podStartSLOduration=124.2860247 podStartE2EDuration="2m4.2860247s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:06.283914153 +0000 UTC m=+143.552562921" watchObservedRunningTime="2025-12-01 09:17:06.2860247 +0000 UTC m=+143.554673468" Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.326763 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.327322 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.827310679 +0000 UTC m=+144.095959447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.428204 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.428814 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:06.928796221 +0000 UTC m=+144.197444989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.452353 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:06 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:06 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:06 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.452612 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.511827 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-265xp" Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.529538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.529930 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.029913242 +0000 UTC m=+144.298562010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.634623 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.634984 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.13496964 +0000 UTC m=+144.403618408 (durationBeforeRetry 500ms). 
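
Each retry deadline in these lines is printed twice: as wall-clock time and as an m=+ offset. The m=+ suffix is Go's monotonic clock reading, which time.Time carries alongside wall time so intervals like the 500ms retry window stay correct even if the wall clock steps. A two-line demonstration:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        start := time.Now() // carries wall and monotonic readings
        fmt.Println(start)  // prints "... m=+0.0000xxxxx", as in the log
        time.Sleep(500 * time.Millisecond)
        fmt.Println(time.Since(start)) // ~500ms, measured on the monotonic clock
    }
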
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.736113 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.736539 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.236521873 +0000 UTC m=+144.505170641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.843276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.843797 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.343777991 +0000 UTC m=+144.612426759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.908584 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" event={"ID":"dc8d6026-9d55-4b47-8a46-00480cd75fe0","Type":"ContainerStarted","Data":"e72d84d7908f60dff72d4c239fc35349f39ff7a9e1acf2adf923d2f1210c06c2"} Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.908630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" event={"ID":"dc8d6026-9d55-4b47-8a46-00480cd75fe0","Type":"ContainerStarted","Data":"e1d958356f14972bf1e538e45c7a6d1017af197ddb8ed8956cc9ba43744affb8"} Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.933197 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q5npm" event={"ID":"6850b102-3de9-4180-9897-79a45817954a","Type":"ContainerStarted","Data":"475e31d714be16d66d72af9d663ac61165318feac2f08cea0c45c25bb183c9b1"} Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.943282 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" event={"ID":"fbf5149a-5e97-445e-b53d-fc3ef1a0f66b","Type":"ContainerStarted","Data":"2230124de9e6274e31a3a8b7b52b8ec32f73624ec30106aee26c109ff0aa1986"} Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.944299 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:06 crc kubenswrapper[4763]: E1201 09:17:06.945354 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.445342204 +0000 UTC m=+144.713990972 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:06 crc kubenswrapper[4763]: I1201 09:17:06.988278 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" event={"ID":"be60db96-5a06-453d-b95e-4637aa61e1f1","Type":"ContainerStarted","Data":"42a1f93f03d91c87b8382bcd0a32dd2252ef98b207e2e91efc607fc5e7e76ffd"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.038982 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6c2mw" podStartSLOduration=125.038963352 podStartE2EDuration="2m5.038963352s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.007984482 +0000 UTC m=+144.276633250" watchObservedRunningTime="2025-12-01 09:17:07.038963352 +0000 UTC m=+144.307612120" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.042624 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q5npm" podStartSLOduration=10.042606531 podStartE2EDuration="10.042606531s" podCreationTimestamp="2025-12-01 09:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.034312626 +0000 UTC m=+144.302961404" watchObservedRunningTime="2025-12-01 09:17:07.042606531 +0000 UTC m=+144.311255299" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.051264 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" event={"ID":"40081104-3347-4d8a-bfe9-04c6f86948be","Type":"ContainerStarted","Data":"d17b255f77cdc9e133de2b5cf58d643f0e784692b1ac19f0f3db21fe77a26716"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.052149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.053349 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.553329491 +0000 UTC m=+144.821978259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.080082 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" event={"ID":"22fc92d5-a686-419c-ae28-a87874c0f06f","Type":"ContainerStarted","Data":"4cea445a7e9ff606b329a003deee3fe66a8eeadd3e1be33bf3492d73ef6c21d7"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.082079 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5m58m" podStartSLOduration=126.082055161 podStartE2EDuration="2m6.082055161s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.081516376 +0000 UTC m=+144.350165144" watchObservedRunningTime="2025-12-01 09:17:07.082055161 +0000 UTC m=+144.350703929" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.137832 4763 generic.go:334] "Generic (PLEG): container finished" podID="c78ac8a1-6273-4efe-9352-82d5aee9e048" containerID="23e322737e30cde7109757e88eaed603a5114689bc616020f8c066a52bfed575" exitCode=0 Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.137920 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" event={"ID":"c78ac8a1-6273-4efe-9352-82d5aee9e048","Type":"ContainerDied","Data":"23e322737e30cde7109757e88eaed603a5114689bc616020f8c066a52bfed575"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.137946 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" event={"ID":"c78ac8a1-6273-4efe-9352-82d5aee9e048","Type":"ContainerStarted","Data":"a4bfc288affceb935607b0a1e97c26b3310705c5d4e37d4eabff61ed1a8087a4"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.143138 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.158572 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.159922 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.659906451 +0000 UTC m=+144.928555219 (durationBeforeRetry 500ms). 
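[Editor's note: the dominant pattern above is a pair of recurring failures on the same PVC: every MountVolume.MountDevice for image-registry-697d97f7c8-v5rbk and every UnmountVolume.TearDown for pod 8f668bae-612b-4b75-9490-919e737c6a3b fails with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers". The kubelet keeps an in-memory registry of CSI node plugins that have announced themselves over the plugin-registration mechanism; until the hostpath provisioner registers, neither operation can even construct a CSI client. A minimal sketch of that lookup pattern, under stated assumptions (illustrative only, not the kubelet's actual code in pkg/volume/csi; all names here are invented):]

```go
package main

import (
	"fmt"
	"sync"
)

// csiDriverRegistry mimics, in highly simplified form, the in-memory list of
// CSI plugins the kubelet learns about through plugin registration.
type csiDriverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]struct{}
}

func newRegistry() *csiDriverRegistry {
	return &csiDriverRegistry{drivers: make(map[string]struct{})}
}

// register is what the registration callback would do once the driver's
// node plugin comes up and announces itself.
func (r *csiDriverRegistry) register(name string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.drivers[name] = struct{}{}
}

// client fails the same way the log lines above do: when the driver has not
// registered yet, the volume operation cannot even build a CSI client.
func (r *csiDriverRegistry) client(name string) error {
	r.mu.RLock()
	defer r.mu.RUnlock()
	if _, ok := r.drivers[name]; !ok {
		return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
	}
	return nil
}

func main() {
	reg := newRegistry()

	// Before the hostpath provisioner's node plugin registers, every
	// MountDevice/TearDown attempt fails and is retried with backoff.
	if err := reg.client("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Println("mount attempt:", err)
	}

	// Once the plugin registers, the same lookup succeeds and the pending
	// volume operations can go through.
	reg.register("kubevirt.io.hostpath-provisioner")
	if err := reg.client("kubevirt.io.hostpath-provisioner"); err == nil {
		fmt.Println("mount attempt: driver registered, proceeding")
	}
}
```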
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.170757 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" event={"ID":"a48c156a-fbc7-481c-96cc-201992569c1e","Type":"ContainerStarted","Data":"eedb06deccdb6675fe7aa72e53e141d1443b162b8c0ff9db53b23b63647fa554"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.206899 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" event={"ID":"cd44fcb8-f4e7-4ff4-8790-b5a47cdb8ea6","Type":"ContainerStarted","Data":"5d01af217fe0a6f2e357c7a53c2f06ec26e59f4286386e3ded914908ce6086a8"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.214888 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" podStartSLOduration=126.214872081 podStartE2EDuration="2m6.214872081s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.213318439 +0000 UTC m=+144.481967207" watchObservedRunningTime="2025-12-01 09:17:07.214872081 +0000 UTC m=+144.483520849" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.219119 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gsfgd" podStartSLOduration=125.219107126 podStartE2EDuration="2m5.219107126s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.157439834 +0000 UTC m=+144.426088602" watchObservedRunningTime="2025-12-01 09:17:07.219107126 +0000 UTC m=+144.487755894" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.233376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" event={"ID":"440365f2-877d-49bd-89c3-0dc4ad54efaa","Type":"ContainerStarted","Data":"1b38d30c5e9e4a640aad56b10d796b6c8a57d19e73f5fb5d15e2762f971d93e4"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.234276 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.235846 4763 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4pz4m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.235906 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" podUID="440365f2-877d-49bd-89c3-0dc4ad54efaa" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.257303 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" event={"ID":"43d53566-a3c2-4f62-80a5-4ffe7e4ed6ea","Type":"ContainerStarted","Data":"ceed4763a1dace6e3f1795985eb8a9f9c87a1d0b843423781b292b524da9cb9c"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.259897 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.261632 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.761613648 +0000 UTC m=+145.030262476 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.274501 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:07 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:07 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:07 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.274550 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.288836 4763 generic.go:334] "Generic (PLEG): container finished" podID="34ce1090-925c-45cc-b797-a08ddbe3dd98" containerID="386b20a38dd799fc39d9e66afa072e549e8356a96ece9fe4dace2888470eb5d6" exitCode=0 Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.288927 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" event={"ID":"34ce1090-925c-45cc-b797-a08ddbe3dd98","Type":"ContainerDied","Data":"386b20a38dd799fc39d9e66afa072e549e8356a96ece9fe4dace2888470eb5d6"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.305679 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-blfn6" podStartSLOduration=125.305661263 podStartE2EDuration="2m5.305661263s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.2502226 +0000 UTC m=+144.518871368" watchObservedRunningTime="2025-12-01 09:17:07.305661263 +0000 UTC m=+144.574310031" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.306928 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zg7dc" podStartSLOduration=126.306922906 podStartE2EDuration="2m6.306922906s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.304557712 +0000 UTC m=+144.573206490" watchObservedRunningTime="2025-12-01 09:17:07.306922906 +0000 UTC m=+144.575571674" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.318345 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" event={"ID":"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9","Type":"ContainerStarted","Data":"845d31ad8cdd13f368710fb9e436440fe3615229753f33f21427604de7be3f3a"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.330884 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sqjc8" event={"ID":"7f3ab311-db3a-4850-a1dc-bd3af2e3f9da","Type":"ContainerStarted","Data":"b33bbfc1160adce3f63cb4c9c97cb00c85c1f0a67687dee2aaf65bb5f804e99e"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.331489 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.332378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" event={"ID":"47303b2b-8915-4828-933b-52f4804bd423","Type":"ContainerStarted","Data":"1e8f07af7ab95205801eb1de9e993a1018b10d8c84c4d3cb3ca29cacb9e8b843"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.351263 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gwr6q" podStartSLOduration=126.351247168 podStartE2EDuration="2m6.351247168s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.349397168 +0000 UTC m=+144.618045936" watchObservedRunningTime="2025-12-01 09:17:07.351247168 +0000 UTC m=+144.619895936" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.364261 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.365555 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" event={"ID":"09f0c6f8-b1c4-4086-9073-2a77ce3a6191","Type":"ContainerStarted","Data":"ca34e3283976150aa061a07205cc722218816e15d807cfa46c80db8e189ee651"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.365593 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" event={"ID":"09f0c6f8-b1c4-4086-9073-2a77ce3a6191","Type":"ContainerStarted","Data":"13e6a468da09b7b601b56430b48d92c255885e421b4722e60c2be24e41281c8f"} Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.366885 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.866874261 +0000 UTC m=+145.135523029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.394555 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" event={"ID":"f6366d07-d82c-4e35-9c94-946426119bde","Type":"ContainerStarted","Data":"659cb6022f359390b64c273b41c67f45f213fbd1380889fd2bd1c7efcf8b7233"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.395378 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.397079 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" event={"ID":"53a538a2-4a8b-4524-aca4-5eff4f91cce5","Type":"ContainerStarted","Data":"89b19e5e8fb7856ee3c74255f6681f74d9b45f278845da0022e2bc9dcb6a111c"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.407110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" event={"ID":"c748802d-9eb3-4f13-80d8-9101979e400e","Type":"ContainerStarted","Data":"c154b2bda1679fcd65523a99c757e63441069844c120d86b1480ff7e75d265a8"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.437583 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" event={"ID":"a67dab1e-ade9-4a36-8e71-8d6fc206d0b5","Type":"ContainerStarted","Data":"fa0c0204e806b3b2225d74682fd26be965abf93822942db2c60fc327bf304715"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.451757 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" event={"ID":"c54cf3d9-0427-43dc-816c-a56ce2c56c83","Type":"ContainerStarted","Data":"808dc76d7b512d5284d1e534737f90f491e2c72ced37986c89444b39f51cfe9a"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.451817 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" event={"ID":"c54cf3d9-0427-43dc-816c-a56ce2c56c83","Type":"ContainerStarted","Data":"9ae7593469593a461f30ee094a37d66b3e9f72b806a1032909e8f8bc99ad3549"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.452740 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-6v56c" 
podStartSLOduration=126.452722419 podStartE2EDuration="2m6.452722419s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.452413361 +0000 UTC m=+144.721062129" watchObservedRunningTime="2025-12-01 09:17:07.452722419 +0000 UTC m=+144.721371177" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.458501 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.467121 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.468768 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:07.968734523 +0000 UTC m=+145.237383291 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.475201 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" event={"ID":"f6329e56-18d1-4479-8699-897fdfdc60fb","Type":"ContainerStarted","Data":"201812e861848c7909ac1bb2454e5cbafc628ba2211d6a80a820b1c37ebf6f40"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.512174 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" event={"ID":"a77fb4c4-4741-4f18-aae2-3aefb20448d0","Type":"ContainerStarted","Data":"cb0391fee75817c886870ceb8c582f8d393779be51fcb6b409a8f89ea0462b37"} Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.514204 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-lk65c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.514238 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lk65c" podUID="f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.536273 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sqjc8" podStartSLOduration=10.536255214 podStartE2EDuration="10.536255214s" podCreationTimestamp="2025-12-01 09:16:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.500844903 +0000 UTC m=+144.769493671" watchObservedRunningTime="2025-12-01 09:17:07.536255214 +0000 UTC m=+144.804903982" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.537046 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-76x97" podStartSLOduration=126.537041405 podStartE2EDuration="2m6.537041405s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.536670315 +0000 UTC m=+144.805319083" watchObservedRunningTime="2025-12-01 09:17:07.537041405 +0000 UTC m=+144.805690173" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.571393 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.573466 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.073432291 +0000 UTC m=+145.342081129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.671477 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ncq7s" podStartSLOduration=125.671440038 podStartE2EDuration="2m5.671440038s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.594358299 +0000 UTC m=+144.863007067" watchObservedRunningTime="2025-12-01 09:17:07.671440038 +0000 UTC m=+144.940088806" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.672606 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" podStartSLOduration=125.67259684 podStartE2EDuration="2m5.67259684s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.670211896 +0000 UTC m=+144.938860674" watchObservedRunningTime="2025-12-01 09:17:07.67259684 +0000 UTC m=+144.941245608" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.676949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.677530 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.177516083 +0000 UTC m=+145.446164851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.718594 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-l2f2g" podStartSLOduration=125.718574136 podStartE2EDuration="2m5.718574136s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.710411755 +0000 UTC m=+144.979060523" watchObservedRunningTime="2025-12-01 09:17:07.718574136 +0000 UTC m=+144.987222914" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.778416 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6mbc" podStartSLOduration=126.778403528 podStartE2EDuration="2m6.778403528s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.77623214 +0000 UTC m=+145.044880908" watchObservedRunningTime="2025-12-01 09:17:07.778403528 +0000 UTC m=+145.047052296" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.783275 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.783667 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.28365474 +0000 UTC m=+145.552303508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.886895 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.887351 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.387320111 +0000 UTC m=+145.655968879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.917061 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zkwbx" podStartSLOduration=126.917043267 podStartE2EDuration="2m6.917043267s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:07.908938287 +0000 UTC m=+145.177587055" watchObservedRunningTime="2025-12-01 09:17:07.917043267 +0000 UTC m=+145.185692035" Dec 01 09:17:07 crc kubenswrapper[4763]: I1201 09:17:07.987854 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:07 crc kubenswrapper[4763]: E1201 09:17:07.988266 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.488254267 +0000 UTC m=+145.756903035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.088482 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.088570 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.588547576 +0000 UTC m=+145.857196344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.088731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.089037 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.589030599 +0000 UTC m=+145.857679367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.141712 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-g8754" podStartSLOduration=127.141694477 podStartE2EDuration="2m7.141694477s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:08.139006004 +0000 UTC m=+145.407654762" watchObservedRunningTime="2025-12-01 09:17:08.141694477 +0000 UTC m=+145.410343245" Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.190308 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.190511 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.690480019 +0000 UTC m=+145.959128787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.190637 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.191001 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.690984884 +0000 UTC m=+145.959633712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.253123 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:08 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:08 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:08 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.253183 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.292405 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.292603 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.792576628 +0000 UTC m=+146.061225396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.292712 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.292996 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.792988629 +0000 UTC m=+146.061637397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.393623 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.393853 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.893803322 +0000 UTC m=+146.162452090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.394129 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.394837 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.894828579 +0000 UTC m=+146.163477347 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.495430 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.495933 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:08.995911089 +0000 UTC m=+146.264559857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.521150 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" event={"ID":"ead8d5dc-6a02-4fd1-8c68-c137fd26bda9","Type":"ContainerStarted","Data":"804e076b4ce8ceec44daaeac098d0a97b277c432a9705ec237e46cd78a98e4e2"} Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.525116 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" event={"ID":"40081104-3347-4d8a-bfe9-04c6f86948be","Type":"ContainerStarted","Data":"769afa28c949af716ee6b0ef46803da2994b86ebecd5d164afcce9a06405dbe0"} Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.527924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" event={"ID":"a9377ab4-04fa-4a1a-afd3-6ab93a78987e","Type":"ContainerStarted","Data":"b8c12b44035f0dbe790b82144c1487f3adb80496d5d879202ffe2854a169d719"} Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.530434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" event={"ID":"be60db96-5a06-453d-b95e-4637aa61e1f1","Type":"ContainerStarted","Data":"1d7687fdcf7b0994d9cf0138024470d7dfed9b0d8e98f88623eff521b5f3601c"} Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.531207 4763 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4pz4m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.531260 4763 prober.go:107] "Probe failed" probeType="Readiness" 
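[Editor's note: buried in the same second is the event that eventually resolves the retry loop: pod hostpath-provisioner/csi-hostpathplugin-ft6z9 reports ContainerStarted. Once that node plugin registers with the kubelet, kubevirt.io.hostpath-provisioner lands in the registered-driver list and the pending mount/unmount operations can build a CSI client. A small sketch of how one might watch for that registration from the node, assuming the kubelet default plugin-registration directory /var/lib/kubelet/plugins_registry; the path, names, and polling approach are assumptions for illustration (the kubelet itself watches this directory with fsnotify rather than polling):]

```go
package main

import (
	"fmt"
	"os"
	"strings"
	"time"
)

// waitForDriverSocket polls the kubelet plugin-registration directory until
// an entry whose name mentions the driver shows up. Purely illustrative.
func waitForDriverSocket(dir, driver string, timeout time.Duration) (string, error) {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		entries, err := os.ReadDir(dir)
		if err != nil {
			return "", err
		}
		for _, e := range entries {
			if strings.Contains(e.Name(), driver) {
				return e.Name(), nil
			}
		}
		time.Sleep(500 * time.Millisecond)
	}
	return "", fmt.Errorf("driver %s did not register within %s", driver, timeout)
}

func main() {
	// Default registration dir on most kubelets; adjust if --root-dir differs.
	sock, err := waitForDriverSocket("/var/lib/kubelet/plugins_registry",
		"kubevirt.io.hostpath-provisioner", 2*time.Minute)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("driver registered via", sock)
}
```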
pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" podUID="440365f2-877d-49bd-89c3-0dc4ad54efaa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.534111 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.601726 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.602005 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.101994145 +0000 UTC m=+146.370642913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.643419 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5s26f" podStartSLOduration=127.643399618 podStartE2EDuration="2m7.643399618s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:08.277287803 +0000 UTC m=+145.545936571" watchObservedRunningTime="2025-12-01 09:17:08.643399618 +0000 UTC m=+145.912048386" Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.714028 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.714648 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.2146261 +0000 UTC m=+146.483274868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.799149 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-sjmfr" podStartSLOduration=127.79913335 podStartE2EDuration="2m7.79913335s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:08.644324883 +0000 UTC m=+145.912973661" watchObservedRunningTime="2025-12-01 09:17:08.79913335 +0000 UTC m=+146.067782119" Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.816415 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.816987 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.316965614 +0000 UTC m=+146.585614382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.863267 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" podStartSLOduration=127.863248199 podStartE2EDuration="2m7.863248199s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:08.80025349 +0000 UTC m=+146.068902258" watchObservedRunningTime="2025-12-01 09:17:08.863248199 +0000 UTC m=+146.131896967" Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.917480 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.917801 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.417774646 +0000 UTC m=+146.686423414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.917939 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.917990 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:08 crc kubenswrapper[4763]: E1201 09:17:08.918259 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.418247699 +0000 UTC m=+146.686896467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:08 crc kubenswrapper[4763]: I1201 09:17:08.936071 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.019051 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.019219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.019260 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.019289 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.022014 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.022297 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.522263549 +0000 UTC m=+146.790912327 (durationBeforeRetry 500ms). 
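Annotation: the interleaved reconciler_common.go messages (":159 UnmountVolume started", ":218 MountVolume started", ":245 VerifyControllerAttachedVolume started", ":293 Volume detached") are one loop: the volume manager diffs its desired state of world (volumes the scheduled pods need) against its actual state of world (volumes currently mounted) and issues operations for the difference. A toy sketch of that diff, with all types and names invented for illustration:

```go
// Toy sketch of the volume manager's reconcile loop visible above:
// mount what is desired but not actual, unmount what is actual but
// no longer desired.
package main

import "fmt"

type volumeKey string

func reconcile(desired, actual map[volumeKey]bool) {
	for v := range desired {
		if !actual[v] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", v)
		}
	}
	for v := range actual {
		if !desired[v] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q\n", v)
		}
	}
}

func main() {
	desired := map[volumeKey]bool{"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8": true}
	actual := map[volumeKey]bool{"kube-api-access-xtj8f": true}
	reconcile(desired, actual)
}
```

This is why the same PVC shows both a pending MountVolume (for image-registry-697d97f7c8-v5rbk) and a pending UnmountVolume (for the departed pod 8f668bae-...): two pods, one volume, opposite directions.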
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.048542 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.055125 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.121207 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.121783 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.121821 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.621805588 +0000 UTC m=+146.890454356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.223141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.223925 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
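Annotation: "No sandbox for pod can be found. Need to start a new one" (util.go:30) means the kubelet is about to ask the CRI runtime (CRI-O on this CRC node) for a fresh pod sandbox before starting any containers. A hedged sketch of the call shape, using the CRI Go bindings; the socket path and metadata values are illustrative, taken from pods in this log:

```go
// Sketch of the CRI RunPodSandbox call behind "Need to start a new one".
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock", // assumed socket path
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	resp, err := runtimeapi.NewRuntimeServiceClient(conn).RunPodSandbox(ctx,
		&runtimeapi.RunPodSandboxRequest{
			Config: &runtimeapi.PodSandboxConfig{
				Metadata: &runtimeapi.PodSandboxMetadata{
					Name:      "certified-operators-sctxq",       // from this log
					Namespace: "openshift-marketplace",            // from this log
					Uid:       "580e94d9-c525-4a0a-b965-6aefa59b2b64",
				},
			},
		})
	if err != nil {
		panic(err)
	}
	fmt.Println("sandbox ID:", resp.PodSandboxId)
}
```

The later "SyncLoop (PLEG): event for pod ... ContainerStarted" lines with long hex IDs are the sandbox and container IDs this call family returns.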
No retries permitted until 2025-12-01 09:17:09.723907625 +0000 UTC m=+146.992556393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.252796 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:09 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:09 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:09 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.252858 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.278341 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" podStartSLOduration=127.278320891 podStartE2EDuration="2m7.278320891s" podCreationTimestamp="2025-12-01 09:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:08.866402004 +0000 UTC m=+146.135050772" watchObservedRunningTime="2025-12-01 09:17:09.278320891 +0000 UTC m=+146.546969669" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.280003 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sctxq"] Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.281355 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.286283 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.292640 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sctxq"] Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.318737 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.323621 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.328434 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.328829 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.82881758 +0000 UTC m=+147.097466348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.378362 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-lk65c container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.378449 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lk65c" podUID="f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.382812 4763 patch_prober.go:28] interesting pod/downloads-7954f5f757-lk65c container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.382867 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lk65c" podUID="f6d8e4f3-3bb6-419c-ae3a-7f6e34a8ffb4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.427697 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.428791 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.429910 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 
09:17:09.430099 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.930082015 +0000 UTC m=+147.198730783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.430201 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-utilities\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.430225 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fws5\" (UniqueName: \"kubernetes.io/projected/580e94d9-c525-4a0a-b965-6aefa59b2b64-kube-api-access-2fws5\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.430271 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.430294 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-catalog-content\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.430647 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:09.93063391 +0000 UTC m=+147.199282678 (durationBeforeRetry 500ms). 
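Annotation: the UniqueName strings quoted throughout ("kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-...") encode plugin and volume identity as `<plugin name>^<volume name>`; the `{volumeName:... podName:... nodeName:}` keys in the nestedpendingoperations errors are built from the same parts (an empty podName marks a device-level operation like MountDevice, a set podName a per-pod one like TearDown). A small parser, purely illustrative:

```go
// Split a kubelet unique volume name of the form "<plugin>^<volume>".
package main

import (
	"fmt"
	"strings"
)

func splitUniqueName(u string) (plugin, volume string, err error) {
	i := strings.Index(u, "^")
	if i < 0 {
		return "", "", fmt.Errorf("no '^' separator in %q", u)
	}
	return u[:i], u[i+1:], nil
}

func main() {
	p, v, err := splitUniqueName(
		"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8")
	if err != nil {
		panic(err)
	}
	fmt.Println("plugin:", p)
	fmt.Println("volume:", v)
}
```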
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.442169 4763 patch_prober.go:28] interesting pod/console-f9d7485db-458q5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.442231 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-458q5" podUID="f0ccd14f-5d77-4541-860f-d834079cf97f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.446218 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jkphf"] Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.447580 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.450987 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.466181 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jkphf"] Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.507814 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.532969 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.533383 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-catalog-content\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.533519 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-catalog-content\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.533715 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj5g5\" (UniqueName: \"kubernetes.io/projected/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-kube-api-access-pj5g5\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.533846 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-utilities\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.533941 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fws5\" (UniqueName: \"kubernetes.io/projected/580e94d9-c525-4a0a-b965-6aefa59b2b64-kube-api-access-2fws5\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.534050 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-utilities\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.534969 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:10.034953828 +0000 UTC m=+147.303602596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.535929 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-catalog-content\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.536955 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-utilities\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.592530 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fws5\" (UniqueName: \"kubernetes.io/projected/580e94d9-c525-4a0a-b965-6aefa59b2b64-kube-api-access-2fws5\") pod \"certified-operators-sctxq\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") " pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.627124 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" event={"ID":"a9377ab4-04fa-4a1a-afd3-6ab93a78987e","Type":"ContainerStarted","Data":"79f57449fea0875822625011a04bfe4b96ae35b36be1768f83e4a4f45bb6d53b"} Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.627357 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" event={"ID":"a9377ab4-04fa-4a1a-afd3-6ab93a78987e","Type":"ContainerStarted","Data":"501de7726c977739eebed2de7d9e1acefbd30eb484fbbbd6384b2c3307ecb76c"} Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.634986 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtj8f\" (UniqueName: \"kubernetes.io/projected/34ce1090-925c-45cc-b797-a08ddbe3dd98-kube-api-access-xtj8f\") pod \"34ce1090-925c-45cc-b797-a08ddbe3dd98\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.635309 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34ce1090-925c-45cc-b797-a08ddbe3dd98-secret-volume\") pod \"34ce1090-925c-45cc-b797-a08ddbe3dd98\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.635606 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34ce1090-925c-45cc-b797-a08ddbe3dd98-config-volume\") pod \"34ce1090-925c-45cc-b797-a08ddbe3dd98\" (UID: \"34ce1090-925c-45cc-b797-a08ddbe3dd98\") " Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.635849 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj5g5\" (UniqueName: 
\"kubernetes.io/projected/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-kube-api-access-pj5g5\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.635943 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-utilities\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.636058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.636162 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-catalog-content\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.636660 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-catalog-content\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.638203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-utilities\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.638406 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:10.138388622 +0000 UTC m=+147.407037390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.638549 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ce1090-925c-45cc-b797-a08ddbe3dd98-config-volume" (OuterVolumeSpecName: "config-volume") pod "34ce1090-925c-45cc-b797-a08ddbe3dd98" (UID: "34ce1090-925c-45cc-b797-a08ddbe3dd98"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.639916 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.646600 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vdnfj"] Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.646952 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ce1090-925c-45cc-b797-a08ddbe3dd98" containerName="collect-profiles" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.647034 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ce1090-925c-45cc-b797-a08ddbe3dd98" containerName="collect-profiles" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.647177 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ce1090-925c-45cc-b797-a08ddbe3dd98" containerName="collect-profiles" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.647855 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.648503 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ce1090-925c-45cc-b797-a08ddbe3dd98-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "34ce1090-925c-45cc-b797-a08ddbe3dd98" (UID: "34ce1090-925c-45cc-b797-a08ddbe3dd98"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.648754 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ce1090-925c-45cc-b797-a08ddbe3dd98-kube-api-access-xtj8f" (OuterVolumeSpecName: "kube-api-access-xtj8f") pod "34ce1090-925c-45cc-b797-a08ddbe3dd98" (UID: "34ce1090-925c-45cc-b797-a08ddbe3dd98"). InnerVolumeSpecName "kube-api-access-xtj8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.650476 4763 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4pz4m container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.651151 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" podUID="440365f2-877d-49bd-89c3-0dc4ad54efaa" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.649997 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.650015 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv" event={"ID":"34ce1090-925c-45cc-b797-a08ddbe3dd98","Type":"ContainerDied","Data":"603ddd5a449afbd2e8419a37fa23bb1544ca30797a7fbb86e9baefeb5c5b3ef3"} Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.651752 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="603ddd5a449afbd2e8419a37fa23bb1544ca30797a7fbb86e9baefeb5c5b3ef3" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.668970 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vdnfj"] Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.675999 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ft6z9" podStartSLOduration=12.675979602 podStartE2EDuration="12.675979602s" podCreationTimestamp="2025-12-01 09:16:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:09.669549017 +0000 UTC m=+146.938197785" watchObservedRunningTime="2025-12-01 09:17:09.675979602 +0000 UTC m=+146.944628370" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.679923 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj5g5\" (UniqueName: \"kubernetes.io/projected/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-kube-api-access-pj5g5\") pod \"community-operators-jkphf\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") " pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.739122 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.739299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stbpg\" (UniqueName: \"kubernetes.io/projected/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-kube-api-access-stbpg\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.739392 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-utilities\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.739649 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-catalog-content\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.739704 4763 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-xtj8f\" (UniqueName: \"kubernetes.io/projected/34ce1090-925c-45cc-b797-a08ddbe3dd98-kube-api-access-xtj8f\") on node \"crc\" DevicePath \"\"" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.739714 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34ce1090-925c-45cc-b797-a08ddbe3dd98-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.739723 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34ce1090-925c-45cc-b797-a08ddbe3dd98-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.740618 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:10.240602013 +0000 UTC m=+147.509250781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.792699 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.842178 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-catalog-content\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.842244 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stbpg\" (UniqueName: \"kubernetes.io/projected/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-kube-api-access-stbpg\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.842275 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-utilities\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.842321 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.842645 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:10.342633479 +0000 UTC m=+147.611282247 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.843490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-catalog-content\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.843822 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-utilities\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.844420 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rlrvc"] Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.845341 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.932166 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stbpg\" (UniqueName: \"kubernetes.io/projected/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-kube-api-access-stbpg\") pod \"certified-operators-vdnfj\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.946582 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rlrvc"] Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.947158 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.947427 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-catalog-content\") pod \"community-operators-rlrvc\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.947483 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-utilities\") pod \"community-operators-rlrvc\" (UID: 
\"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.947517 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95s82\" (UniqueName: \"kubernetes.io/projected/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-kube-api-access-95s82\") pod \"community-operators-rlrvc\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:09 crc kubenswrapper[4763]: E1201 09:17:09.947667 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:10.447649697 +0000 UTC m=+147.716298465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:09 crc kubenswrapper[4763]: I1201 09:17:09.972485 4763 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:09.999948 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.050297 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-catalog-content\") pod \"community-operators-rlrvc\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.050350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-utilities\") pod \"community-operators-rlrvc\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.050379 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95s82\" (UniqueName: \"kubernetes.io/projected/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-kube-api-access-95s82\") pod \"community-operators-rlrvc\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.050424 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:10 crc kubenswrapper[4763]: E1201 09:17:10.050788 4763 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-01 09:17:10.550761501 +0000 UTC m=+147.819410269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-v5rbk" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.051302 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-utilities\") pod \"community-operators-rlrvc\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.051701 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-catalog-content\") pod \"community-operators-rlrvc\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.089571 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95s82\" (UniqueName: \"kubernetes.io/projected/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-kube-api-access-95s82\") pod \"community-operators-rlrvc\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.155067 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:10 crc kubenswrapper[4763]: E1201 09:17:10.155413 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-01 09:17:10.655394788 +0000 UTC m=+147.924043556 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.173840 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bmp8l" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.204046 4763 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-01T09:17:09.972698816Z","Handler":null,"Name":""} Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.232177 4763 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.232229 4763 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.247735 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.258361 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.301385 4763 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
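Annotation: "attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..." means the newly registered driver does not advertise NodeStageVolume support, so the kubelet skips the staging step and goes straight to NodePublishVolume, which is the "MountVolume.SetUp succeeded" for the registry PVC just below. A sketch of the capability check using the CSI spec's Go bindings; the endpoint is the one logged during registration:

```go
// Query a CSI node plugin for the STAGE_UNSTAGE_VOLUME capability.
package main

import (
	"context"
	"fmt"
	"time"

	"github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	resp, err := csi.NewNodeClient(conn).NodeGetCapabilities(ctx,
		&csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		panic(err)
	}
	staged := false
	for _, c := range resp.Capabilities {
		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
			staged = true
		}
	}
	// Per the log above, this prints false for the hostpath provisioner,
	// which is why MountDevice is a no-op and the global device mount path
	// under .../plugins/kubernetes.io/csi/.../globalmount is created directly.
	fmt.Println("STAGE_UNSTAGE_VOLUME:", staged)
}
```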
Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.301429 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.306674 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:10 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:10 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:10 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.306720 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.704595 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b881609c92f267303e3362d0ac56c781344f710066bd5a9023488ab6b9a21e94"} Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.714448 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1aaffce24b9f392173580931b092ab649f09f51f68ec0ddf83995780ddb6609e"} Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.724042 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8073a8afb687ac3749e5e9ccfdf2fa1adc26f86f5e535ebb26c6d2938be1bd08"} Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.746134 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sctxq"] Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.763052 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jkphf"] Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.824780 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-v5rbk\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.884895 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.937605 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.938196 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:17:10 crc kubenswrapper[4763]: I1201 09:17:10.953329 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.003299 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.010618 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.053869 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rlrvc"] Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.071966 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.098958 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vdnfj"] Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.129915 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.129958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.239692 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-df54r"] Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.240998 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.244394 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.250266 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.265876 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:11 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:11 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:11 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.266100 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.280574 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-df54r"] Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.296077 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp7cg\" (UniqueName: \"kubernetes.io/projected/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-kube-api-access-qp7cg\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.296156 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-utilities\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.296229 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-catalog-content\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.300472 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.315489 4763 patch_prober.go:28] interesting pod/apiserver-76f77b778f-wf2br container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]log ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]etcd ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/generic-apiserver-start-informers ok Dec 01 09:17:11 crc kubenswrapper[4763]: 
[+]poststarthook/max-in-flight-filter ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 01 09:17:11 crc kubenswrapper[4763]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 01 09:17:11 crc kubenswrapper[4763]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/project.openshift.io-projectcache ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-startinformers ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 01 09:17:11 crc kubenswrapper[4763]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 01 09:17:11 crc kubenswrapper[4763]: livez check failed Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.315558 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" podUID="40081104-3347-4d8a-bfe9-04c6f86948be" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.399118 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-catalog-content\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.399199 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp7cg\" (UniqueName: \"kubernetes.io/projected/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-kube-api-access-qp7cg\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.399317 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-utilities\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.401113 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-utilities\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.405078 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-catalog-content\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.444791 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp7cg\" (UniqueName: 
\"kubernetes.io/projected/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-kube-api-access-qp7cg\") pod \"redhat-marketplace-df54r\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") " pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.577891 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.634163 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dzd9h"] Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.635105 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.651709 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dzd9h"] Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.673733 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5rbk"] Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.703494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-catalog-content\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.703541 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdvcf\" (UniqueName: \"kubernetes.io/projected/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-kube-api-access-wdvcf\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.703581 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-utilities\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.764833 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b333581a954f9c2da1d16170fd0de989c3804582bc32eef5330b535db37a3969"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.770848 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d68e9e2d877bbe6ff2198f0229e222c21b3b64b1f98716a653a8a0c9bdd3e84a"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.771341 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.772399 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" 
event={"ID":"a41fc022-b655-4924-8b6a-dd3cd87ef9ba","Type":"ContainerStarted","Data":"133cf3bbf28e832226d57ae969b11fcd09de3bd7bf4d61bfe661fae22295fe37"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.787310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f1ad5321f22642ff8dee35d98ee1f0096db30a566775e0653602163e84871bfb"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.795298 4763 generic.go:334] "Generic (PLEG): container finished" podID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerID="c31156d62c816f8c0b506995c444689844f1e16177d14945f50dd9447c0f4222" exitCode=0 Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.795359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vdnfj" event={"ID":"0edc4cd3-ec16-4757-93d5-be9a6272a0a5","Type":"ContainerDied","Data":"c31156d62c816f8c0b506995c444689844f1e16177d14945f50dd9447c0f4222"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.795380 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vdnfj" event={"ID":"0edc4cd3-ec16-4757-93d5-be9a6272a0a5","Type":"ContainerStarted","Data":"34b60406dce0d884f5eca665e8c87c79d92a63967771819c4d2c553bd5be377a"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.802069 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.804353 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-utilities\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.804504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-catalog-content\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.804542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdvcf\" (UniqueName: \"kubernetes.io/projected/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-kube-api-access-wdvcf\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.805896 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-catalog-content\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.806856 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-utilities\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 
09:17:11.821447 4763 generic.go:334] "Generic (PLEG): container finished" podID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerID="13dc130819411941112637d9944fb60f0554d01dec17ef65603bb396c2ee8309" exitCode=0 Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.821591 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sctxq" event={"ID":"580e94d9-c525-4a0a-b965-6aefa59b2b64","Type":"ContainerDied","Data":"13dc130819411941112637d9944fb60f0554d01dec17ef65603bb396c2ee8309"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.821667 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sctxq" event={"ID":"580e94d9-c525-4a0a-b965-6aefa59b2b64","Type":"ContainerStarted","Data":"8c4bbd0f5e63521c35475cebbc8fafee22623f088663ec4f635c0ab5e890d7b0"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.848390 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdvcf\" (UniqueName: \"kubernetes.io/projected/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-kube-api-access-wdvcf\") pod \"redhat-marketplace-dzd9h\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.853406 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerID="d018413111e40f7d6a9d99c251e7fec5fb8bf03a13daace19d25102e04076773" exitCode=0 Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.853506 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jkphf" event={"ID":"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18","Type":"ContainerDied","Data":"d018413111e40f7d6a9d99c251e7fec5fb8bf03a13daace19d25102e04076773"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.853533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jkphf" event={"ID":"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18","Type":"ContainerStarted","Data":"d2fa62713016b86344bcd21ce87a1d4ea25bfd9c823cfad01532ff6d51e503ee"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.862392 4763 generic.go:334] "Generic (PLEG): container finished" podID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerID="e0a496b7c78a602338c28e06d47413194b967b3a36c79e195c797d101b868018" exitCode=0 Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.862467 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlrvc" event={"ID":"7eee7d18-22c2-4cd4-aa75-01c94eb4423d","Type":"ContainerDied","Data":"e0a496b7c78a602338c28e06d47413194b967b3a36c79e195c797d101b868018"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.862495 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlrvc" event={"ID":"7eee7d18-22c2-4cd4-aa75-01c94eb4423d","Type":"ContainerStarted","Data":"c2c13d88ab75eee6010b8630df6bb1cc2c0c3b6aa3889364b06ffc9564a85b82"} Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.882152 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kv4n2" Dec 01 09:17:11 crc kubenswrapper[4763]: I1201 09:17:11.949835 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.159513 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-df54r"] Dec 01 09:17:12 crc kubenswrapper[4763]: W1201 09:17:12.180906 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa7c3a1_91cb_460a_a74e_3027d72cdfcb.slice/crio-81df243cd5abcf7c4deac3e39858b2bcb957d8c8a6f5f709eb22e4c6f0c5370a WatchSource:0}: Error finding container 81df243cd5abcf7c4deac3e39858b2bcb957d8c8a6f5f709eb22e4c6f0c5370a: Status 404 returned error can't find the container with id 81df243cd5abcf7c4deac3e39858b2bcb957d8c8a6f5f709eb22e4c6f0c5370a Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.263678 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:12 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:12 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:12 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.263999 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.453792 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dzd9h"] Dec 01 09:17:12 crc kubenswrapper[4763]: W1201 09:17:12.471916 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f902258_1cf0_4e18_a155_b43ca9cd2cc4.slice/crio-4ff1709728fc6fc383f2a497234e20857eb0dca1c5c40e814893543aea860875 WatchSource:0}: Error finding container 4ff1709728fc6fc383f2a497234e20857eb0dca1c5c40e814893543aea860875: Status 404 returned error can't find the container with id 4ff1709728fc6fc383f2a497234e20857eb0dca1c5c40e814893543aea860875 Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.636108 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-42cb5"] Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.637379 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.639650 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.653544 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-42cb5"] Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.730588 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wr7t\" (UniqueName: \"kubernetes.io/projected/91386cf5-c3df-4e87-be1a-14989dee67f9-kube-api-access-9wr7t\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.730665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-utilities\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.730703 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-catalog-content\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.832249 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wr7t\" (UniqueName: \"kubernetes.io/projected/91386cf5-c3df-4e87-be1a-14989dee67f9-kube-api-access-9wr7t\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.832354 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-utilities\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.832408 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-catalog-content\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.835308 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-catalog-content\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.836050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-utilities\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " 
pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.854447 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wr7t\" (UniqueName: \"kubernetes.io/projected/91386cf5-c3df-4e87-be1a-14989dee67f9-kube-api-access-9wr7t\") pod \"redhat-operators-42cb5\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") " pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.872669 4763 generic.go:334] "Generic (PLEG): container finished" podID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerID="c399003af904226f6c8e09f611ab672c768c8658da09c3f999f6da3561a1295c" exitCode=0 Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.872753 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-df54r" event={"ID":"caa7c3a1-91cb-460a-a74e-3027d72cdfcb","Type":"ContainerDied","Data":"c399003af904226f6c8e09f611ab672c768c8658da09c3f999f6da3561a1295c"} Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.872784 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-df54r" event={"ID":"caa7c3a1-91cb-460a-a74e-3027d72cdfcb","Type":"ContainerStarted","Data":"81df243cd5abcf7c4deac3e39858b2bcb957d8c8a6f5f709eb22e4c6f0c5370a"} Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.890250 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerID="e4672137a7205aebca6cce3ba91a6f003116d5f435c386d3402a888fb500cdad" exitCode=0 Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.890360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dzd9h" event={"ID":"4f902258-1cf0-4e18-a155-b43ca9cd2cc4","Type":"ContainerDied","Data":"e4672137a7205aebca6cce3ba91a6f003116d5f435c386d3402a888fb500cdad"} Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.890397 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dzd9h" event={"ID":"4f902258-1cf0-4e18-a155-b43ca9cd2cc4","Type":"ContainerStarted","Data":"4ff1709728fc6fc383f2a497234e20857eb0dca1c5c40e814893543aea860875"} Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.914123 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" event={"ID":"a41fc022-b655-4924-8b6a-dd3cd87ef9ba","Type":"ContainerStarted","Data":"59d35de8991a1e7d17b821d9c7f00f3b17921c43666c1735b1eb24f77d9f05e3"} Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.915542 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.964432 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:17:12 crc kubenswrapper[4763]: I1201 09:17:12.964484 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" podStartSLOduration=131.964467842 podStartE2EDuration="2m11.964467842s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:12.962128649 +0000 UTC m=+150.230777437" watchObservedRunningTime="2025-12-01 09:17:12.964467842 +0000 UTC m=+150.233116610" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.044679 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cn4xw"] Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.050134 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.058321 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn4xw"] Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.091404 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sqjc8" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.145984 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bfqv\" (UniqueName: \"kubernetes.io/projected/c9759bf4-0f7f-459d-b393-59c047d7a4d9-kube-api-access-8bfqv\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.146052 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-catalog-content\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.146173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-utilities\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.246887 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bfqv\" (UniqueName: \"kubernetes.io/projected/c9759bf4-0f7f-459d-b393-59c047d7a4d9-kube-api-access-8bfqv\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.246924 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-catalog-content\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.246989 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-utilities\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.247438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-utilities\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.247926 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-catalog-content\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.259854 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:13 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:13 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:13 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.259921 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.283436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bfqv\" (UniqueName: \"kubernetes.io/projected/c9759bf4-0f7f-459d-b393-59c047d7a4d9-kube-api-access-8bfqv\") pod \"redhat-operators-cn4xw\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.383936 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.457170 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-42cb5"] Dec 01 09:17:13 crc kubenswrapper[4763]: W1201 09:17:13.467258 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91386cf5_c3df_4e87_be1a_14989dee67f9.slice/crio-cf228e5378761ac938f9c99a1614061a0904b12765565319608630af82ce1337 WatchSource:0}: Error finding container cf228e5378761ac938f9c99a1614061a0904b12765565319608630af82ce1337: Status 404 returned error can't find the container with id cf228e5378761ac938f9c99a1614061a0904b12765565319608630af82ce1337 Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.528648 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.529677 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.536277 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.536902 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.544214 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.656670 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.656766 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.757741 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.757844 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.757938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.776002 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.928147 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.983395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42cb5" event={"ID":"91386cf5-c3df-4e87-be1a-14989dee67f9","Type":"ContainerStarted","Data":"cf228e5378761ac938f9c99a1614061a0904b12765565319608630af82ce1337"} Dec 01 09:17:13 crc kubenswrapper[4763]: I1201 09:17:13.998318 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cn4xw"] Dec 01 09:17:14 crc kubenswrapper[4763]: W1201 09:17:14.087280 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9759bf4_0f7f_459d_b393_59c047d7a4d9.slice/crio-aeafac82980205e80991ff9f6ea94ca357adb80254a03d4c1598bf6b8b5ed8d5 WatchSource:0}: Error finding container aeafac82980205e80991ff9f6ea94ca357adb80254a03d4c1598bf6b8b5ed8d5: Status 404 returned error can't find the container with id aeafac82980205e80991ff9f6ea94ca357adb80254a03d4c1598bf6b8b5ed8d5 Dec 01 09:17:14 crc kubenswrapper[4763]: I1201 09:17:14.249211 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 01 09:17:14 crc kubenswrapper[4763]: I1201 09:17:14.255079 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:14 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:14 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:14 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:14 crc kubenswrapper[4763]: I1201 09:17:14.255145 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:14 crc kubenswrapper[4763]: W1201 09:17:14.270507 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3d78cb79_2cc0_43ea_8f95_15b6d1ef048d.slice/crio-df1c9425944a3eaa3c2987679b8c52d764fd5853e85bad1b5e8b7ae3d3a38ce1 WatchSource:0}: Error finding container df1c9425944a3eaa3c2987679b8c52d764fd5853e85bad1b5e8b7ae3d3a38ce1: Status 404 returned error can't find the container with id df1c9425944a3eaa3c2987679b8c52d764fd5853e85bad1b5e8b7ae3d3a38ce1 Dec 01 09:17:14 crc kubenswrapper[4763]: I1201 09:17:14.999390 4763 generic.go:334] "Generic (PLEG): container finished" podID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerID="de9eee2c1fdb0d933e79765e710f57688f7bb089c76485523e74a2c506ede97a" exitCode=0 Dec 01 09:17:15 crc kubenswrapper[4763]: I1201 09:17:15.005198 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42cb5" event={"ID":"91386cf5-c3df-4e87-be1a-14989dee67f9","Type":"ContainerDied","Data":"de9eee2c1fdb0d933e79765e710f57688f7bb089c76485523e74a2c506ede97a"} Dec 01 09:17:15 crc kubenswrapper[4763]: I1201 09:17:15.006664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d","Type":"ContainerStarted","Data":"c57c25355f9c52f3a0755f7d54a0d35cd258657f0be5572307296e3e4f08d217"} Dec 01 09:17:15 
crc kubenswrapper[4763]: I1201 09:17:15.006682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d","Type":"ContainerStarted","Data":"df1c9425944a3eaa3c2987679b8c52d764fd5853e85bad1b5e8b7ae3d3a38ce1"} Dec 01 09:17:15 crc kubenswrapper[4763]: I1201 09:17:15.012865 4763 generic.go:334] "Generic (PLEG): container finished" podID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerID="157acdd9ff1f37b40b792602dab2e2ada33d4a30bb7acd3c7460a659f21e7c5b" exitCode=0 Dec 01 09:17:15 crc kubenswrapper[4763]: I1201 09:17:15.013551 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn4xw" event={"ID":"c9759bf4-0f7f-459d-b393-59c047d7a4d9","Type":"ContainerDied","Data":"157acdd9ff1f37b40b792602dab2e2ada33d4a30bb7acd3c7460a659f21e7c5b"} Dec 01 09:17:15 crc kubenswrapper[4763]: I1201 09:17:15.013581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn4xw" event={"ID":"c9759bf4-0f7f-459d-b393-59c047d7a4d9","Type":"ContainerStarted","Data":"aeafac82980205e80991ff9f6ea94ca357adb80254a03d4c1598bf6b8b5ed8d5"} Dec 01 09:17:15 crc kubenswrapper[4763]: I1201 09:17:15.053182 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.053165706 podStartE2EDuration="2.053165706s" podCreationTimestamp="2025-12-01 09:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:15.052816477 +0000 UTC m=+152.321465245" watchObservedRunningTime="2025-12-01 09:17:15.053165706 +0000 UTC m=+152.321814474" Dec 01 09:17:15 crc kubenswrapper[4763]: I1201 09:17:15.278964 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:15 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:15 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:15 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:15 crc kubenswrapper[4763]: I1201 09:17:15.279044 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:16 crc kubenswrapper[4763]: I1201 09:17:16.133668 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:16 crc kubenswrapper[4763]: I1201 09:17:16.139484 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wf2br" Dec 01 09:17:16 crc kubenswrapper[4763]: I1201 09:17:16.255687 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:16 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:16 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:16 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:16 crc 
kubenswrapper[4763]: I1201 09:17:16.255755 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.034695 4763 generic.go:334] "Generic (PLEG): container finished" podID="3d78cb79-2cc0-43ea-8f95-15b6d1ef048d" containerID="c57c25355f9c52f3a0755f7d54a0d35cd258657f0be5572307296e3e4f08d217" exitCode=0 Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.034795 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d","Type":"ContainerDied","Data":"c57c25355f9c52f3a0755f7d54a0d35cd258657f0be5572307296e3e4f08d217"} Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.257332 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:17 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:17 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:17 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.257400 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.982332 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.983274 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.987161 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.988398 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 09:17:17 crc kubenswrapper[4763]: I1201 09:17:17.991076 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.064489 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.064544 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.165631 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.165715 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.165738 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.186161 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.254348 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:18 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:18 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:18 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.254402 4763 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.309148 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.485573 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.572807 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kubelet-dir\") pod \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\" (UID: \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\") " Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.572916 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kube-api-access\") pod \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\" (UID: \"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d\") " Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.573338 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3d78cb79-2cc0-43ea-8f95-15b6d1ef048d" (UID: "3d78cb79-2cc0-43ea-8f95-15b6d1ef048d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.577763 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3d78cb79-2cc0-43ea-8f95-15b6d1ef048d" (UID: "3d78cb79-2cc0-43ea-8f95-15b6d1ef048d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.677079 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.677418 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d78cb79-2cc0-43ea-8f95-15b6d1ef048d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:17:18 crc kubenswrapper[4763]: I1201 09:17:18.951827 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 01 09:17:18 crc kubenswrapper[4763]: W1201 09:17:18.979557 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd50543d2_9343_4942_b2a9_ac4736d7bd8b.slice/crio-a201491c77b2651fbe82970f5aeee25b9b763fba025ccaf7acad36d9c67fba55 WatchSource:0}: Error finding container a201491c77b2651fbe82970f5aeee25b9b763fba025ccaf7acad36d9c67fba55: Status 404 returned error can't find the container with id a201491c77b2651fbe82970f5aeee25b9b763fba025ccaf7acad36d9c67fba55 Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.091275 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.091233 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3d78cb79-2cc0-43ea-8f95-15b6d1ef048d","Type":"ContainerDied","Data":"df1c9425944a3eaa3c2987679b8c52d764fd5853e85bad1b5e8b7ae3d3a38ce1"} Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.091783 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df1c9425944a3eaa3c2987679b8c52d764fd5853e85bad1b5e8b7ae3d3a38ce1" Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.095178 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d50543d2-9343-4942-b2a9-ac4736d7bd8b","Type":"ContainerStarted","Data":"a201491c77b2651fbe82970f5aeee25b9b763fba025ccaf7acad36d9c67fba55"} Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.253291 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:19 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:19 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:19 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.253354 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.391935 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lk65c" Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.442092 4763 patch_prober.go:28] interesting pod/console-f9d7485db-458q5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 01 09:17:19 crc kubenswrapper[4763]: I1201 09:17:19.442150 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-458q5" podUID="f0ccd14f-5d77-4541-860f-d834079cf97f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 01 09:17:20 crc kubenswrapper[4763]: I1201 09:17:20.259388 4763 patch_prober.go:28] interesting pod/router-default-5444994796-z5zrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 01 09:17:20 crc kubenswrapper[4763]: [-]has-synced failed: reason withheld Dec 01 09:17:20 crc kubenswrapper[4763]: [+]process-running ok Dec 01 09:17:20 crc kubenswrapper[4763]: healthz check failed Dec 01 09:17:20 crc kubenswrapper[4763]: I1201 09:17:20.259720 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z5zrr" podUID="81f45a90-202d-4042-96d3-1b24683fc0b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 09:17:21 crc kubenswrapper[4763]: I1201 09:17:21.131815 
4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d50543d2-9343-4942-b2a9-ac4736d7bd8b","Type":"ContainerStarted","Data":"fd33803c4f44f6402155be9e94071c434a550c71f235880772211359a55780a8"} Dec 01 09:17:21 crc kubenswrapper[4763]: I1201 09:17:21.156691 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.156668381 podStartE2EDuration="4.156668381s" podCreationTimestamp="2025-12-01 09:17:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:17:21.14848583 +0000 UTC m=+158.417134618" watchObservedRunningTime="2025-12-01 09:17:21.156668381 +0000 UTC m=+158.425317149" Dec 01 09:17:21 crc kubenswrapper[4763]: I1201 09:17:21.252133 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:21 crc kubenswrapper[4763]: I1201 09:17:21.256693 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-z5zrr" Dec 01 09:17:22 crc kubenswrapper[4763]: I1201 09:17:22.156242 4763 generic.go:334] "Generic (PLEG): container finished" podID="d50543d2-9343-4942-b2a9-ac4736d7bd8b" containerID="fd33803c4f44f6402155be9e94071c434a550c71f235880772211359a55780a8" exitCode=0 Dec 01 09:17:22 crc kubenswrapper[4763]: I1201 09:17:22.156314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d50543d2-9343-4942-b2a9-ac4736d7bd8b","Type":"ContainerDied","Data":"fd33803c4f44f6402155be9e94071c434a550c71f235880772211359a55780a8"} Dec 01 09:17:24 crc kubenswrapper[4763]: I1201 09:17:24.162436 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:17:24 crc kubenswrapper[4763]: I1201 09:17:24.185352 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db50acd1-5694-49bc-9027-e96f7612e795-metrics-certs\") pod \"network-metrics-daemon-rtkzb\" (UID: \"db50acd1-5694-49bc-9027-e96f7612e795\") " pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:17:24 crc kubenswrapper[4763]: I1201 09:17:24.434079 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rtkzb" Dec 01 09:17:29 crc kubenswrapper[4763]: I1201 09:17:29.430274 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:17:29 crc kubenswrapper[4763]: I1201 09:17:29.440264 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:17:31 crc kubenswrapper[4763]: I1201 09:17:31.082980 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:17:33 crc kubenswrapper[4763]: I1201 09:17:33.930106 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:17:33 crc kubenswrapper[4763]: I1201 09:17:33.930802 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:17:34 crc kubenswrapper[4763]: I1201 09:17:34.934125 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.093960 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kubelet-dir\") pod \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\" (UID: \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\") " Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.094026 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kube-api-access\") pod \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\" (UID: \"d50543d2-9343-4942-b2a9-ac4736d7bd8b\") " Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.094101 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d50543d2-9343-4942-b2a9-ac4736d7bd8b" (UID: "d50543d2-9343-4942-b2a9-ac4736d7bd8b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.094334 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.100152 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d50543d2-9343-4942-b2a9-ac4736d7bd8b" (UID: "d50543d2-9343-4942-b2a9-ac4736d7bd8b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.195849 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d50543d2-9343-4942-b2a9-ac4736d7bd8b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.280722 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.280713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d50543d2-9343-4942-b2a9-ac4736d7bd8b","Type":"ContainerDied","Data":"a201491c77b2651fbe82970f5aeee25b9b763fba025ccaf7acad36d9c67fba55"} Dec 01 09:17:35 crc kubenswrapper[4763]: I1201 09:17:35.280965 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a201491c77b2651fbe82970f5aeee25b9b763fba025ccaf7acad36d9c67fba55" Dec 01 09:17:41 crc kubenswrapper[4763]: I1201 09:17:41.327226 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-t9rl9" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.327395 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.367243 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 09:17:49 crc kubenswrapper[4763]: E1201 09:17:49.367448 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50543d2-9343-4942-b2a9-ac4736d7bd8b" containerName="pruner" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.367484 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50543d2-9343-4942-b2a9-ac4736d7bd8b" containerName="pruner" Dec 01 09:17:49 crc kubenswrapper[4763]: E1201 09:17:49.367496 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d78cb79-2cc0-43ea-8f95-15b6d1ef048d" containerName="pruner" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.367503 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d78cb79-2cc0-43ea-8f95-15b6d1ef048d" containerName="pruner" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.367648 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d78cb79-2cc0-43ea-8f95-15b6d1ef048d" containerName="pruner" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.367663 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50543d2-9343-4942-b2a9-ac4736d7bd8b" containerName="pruner" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.368029 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.370376 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.370569 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.380993 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.535560 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5567adb-2074-432a-ba69-93b2ea007cf2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d5567adb-2074-432a-ba69-93b2ea007cf2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.535619 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5567adb-2074-432a-ba69-93b2ea007cf2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d5567adb-2074-432a-ba69-93b2ea007cf2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.636858 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5567adb-2074-432a-ba69-93b2ea007cf2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d5567adb-2074-432a-ba69-93b2ea007cf2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.637003 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5567adb-2074-432a-ba69-93b2ea007cf2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d5567adb-2074-432a-ba69-93b2ea007cf2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.637084 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5567adb-2074-432a-ba69-93b2ea007cf2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d5567adb-2074-432a-ba69-93b2ea007cf2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.667257 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5567adb-2074-432a-ba69-93b2ea007cf2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d5567adb-2074-432a-ba69-93b2ea007cf2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:17:49 crc kubenswrapper[4763]: I1201 09:17:49.693248 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:17:51 crc kubenswrapper[4763]: E1201 09:17:51.821297 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 09:17:51 crc kubenswrapper[4763]: E1201 09:17:51.822075 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fws5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sctxq_openshift-marketplace(580e94d9-c525-4a0a-b965-6aefa59b2b64): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:17:51 crc kubenswrapper[4763]: E1201 09:17:51.823233 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sctxq" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" Dec 01 09:17:53 crc kubenswrapper[4763]: E1201 09:17:53.103969 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sctxq" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" Dec 01 09:17:53 crc kubenswrapper[4763]: E1201 09:17:53.173612 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 09:17:53 crc kubenswrapper[4763]: E1201 09:17:53.173757 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wdvcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dzd9h_openshift-marketplace(4f902258-1cf0-4e18-a155-b43ca9cd2cc4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:17:53 crc kubenswrapper[4763]: E1201 09:17:53.175079 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dzd9h" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.568757 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.571838 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.579847 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.692528 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.692574 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.692589 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-var-lock\") pod \"installer-9-crc\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.793687 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.794066 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-var-lock\") pod \"installer-9-crc\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.794175 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kube-api-access\") pod \"installer-9-crc\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.794561 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.794610 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-var-lock\") pod \"installer-9-crc\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.827212 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:53 crc kubenswrapper[4763]: I1201 09:17:53.896188 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:17:55 crc kubenswrapper[4763]: E1201 09:17:55.665879 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dzd9h" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" Dec 01 09:17:55 crc kubenswrapper[4763]: E1201 09:17:55.836718 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 09:17:55 crc kubenswrapper[4763]: E1201 09:17:55.837038 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pj5g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jkphf_openshift-marketplace(0ce2b6fa-b131-466e-9ee9-4c4672c9fa18): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:17:55 crc kubenswrapper[4763]: E1201 09:17:55.838577 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jkphf" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" Dec 01 09:17:56 crc kubenswrapper[4763]: E1201 09:17:56.141336 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 01 09:17:56 crc kubenswrapper[4763]: E1201 09:17:56.141545 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-stbpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-vdnfj_openshift-marketplace(0edc4cd3-ec16-4757-93d5-be9a6272a0a5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:17:56 crc kubenswrapper[4763]: E1201 09:17:56.142906 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vdnfj" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" Dec 01 09:17:56 crc kubenswrapper[4763]: E1201 09:17:56.257140 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 01 09:17:56 crc kubenswrapper[4763]: E1201 09:17:56.257337 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qp7cg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-df54r_openshift-marketplace(caa7c3a1-91cb-460a-a74e-3027d72cdfcb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:17:56 crc kubenswrapper[4763]: E1201 09:17:56.259650 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-df54r" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" Dec 01 09:17:58 crc kubenswrapper[4763]: I1201 09:17:58.273383 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxfdf"] Dec 01 09:18:01 crc kubenswrapper[4763]: E1201 09:18:01.841623 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-df54r" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" Dec 01 09:18:01 crc kubenswrapper[4763]: E1201 09:18:01.841690 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jkphf" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" Dec 01 09:18:01 crc kubenswrapper[4763]: E1201 09:18:01.841825 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vdnfj" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" Dec 01 09:18:01 crc kubenswrapper[4763]: E1201 09:18:01.859439 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 09:18:01 crc kubenswrapper[4763]: E1201 09:18:01.859981 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bfqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cn4xw_openshift-marketplace(c9759bf4-0f7f-459d-b393-59c047d7a4d9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:18:01 crc kubenswrapper[4763]: E1201 09:18:01.861794 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cn4xw" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" Dec 01 09:18:02 crc kubenswrapper[4763]: I1201 09:18:02.306323 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 01 09:18:02 crc kubenswrapper[4763]: I1201 09:18:02.374995 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rtkzb"] Dec 01 09:18:02 crc kubenswrapper[4763]: I1201 09:18:02.380036 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 01 09:18:02 crc kubenswrapper[4763]: W1201 09:18:02.396706 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb50acd1_5694_49bc_9027_e96f7612e795.slice/crio-ec627c9e00089aa4dd6e528a004d5b4430cf0de9657895ffc41bf94d7d99c795 WatchSource:0}: Error finding container ec627c9e00089aa4dd6e528a004d5b4430cf0de9657895ffc41bf94d7d99c795: Status 404 returned error can't find the container with id ec627c9e00089aa4dd6e528a004d5b4430cf0de9657895ffc41bf94d7d99c795 Dec 01 09:18:02 crc kubenswrapper[4763]: W1201 09:18:02.405100 4763 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-podd5567adb_2074_432a_ba69_93b2ea007cf2.slice/crio-60a3b7c45281cc9c0f9964c9f33652864ef43aee4f4bfe2a5d28243643bc8002 WatchSource:0}: Error finding container 60a3b7c45281cc9c0f9964c9f33652864ef43aee4f4bfe2a5d28243643bc8002: Status 404 returned error can't find the container with id 60a3b7c45281cc9c0f9964c9f33652864ef43aee4f4bfe2a5d28243643bc8002 Dec 01 09:18:02 crc kubenswrapper[4763]: I1201 09:18:02.450935 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d5567adb-2074-432a-ba69-93b2ea007cf2","Type":"ContainerStarted","Data":"60a3b7c45281cc9c0f9964c9f33652864ef43aee4f4bfe2a5d28243643bc8002"} Dec 01 09:18:02 crc kubenswrapper[4763]: I1201 09:18:02.451980 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" event={"ID":"db50acd1-5694-49bc-9027-e96f7612e795","Type":"ContainerStarted","Data":"ec627c9e00089aa4dd6e528a004d5b4430cf0de9657895ffc41bf94d7d99c795"} Dec 01 09:18:02 crc kubenswrapper[4763]: I1201 09:18:02.453546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b7b34902-05a3-46de-9dc9-4a55e71c6e2a","Type":"ContainerStarted","Data":"49568e126e3d44223151c4f503e7b5887f48c2c78afd78e4e4e988caca4e014a"} Dec 01 09:18:02 crc kubenswrapper[4763]: E1201 09:18:02.548747 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 01 09:18:02 crc kubenswrapper[4763]: E1201 09:18:02.548916 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9wr7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-42cb5_openshift-marketplace(91386cf5-c3df-4e87-be1a-14989dee67f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
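The two "Failed to process watch event ... Status 404" warnings above are a startup race rather than a real failure: the cgroup for a new crio-* container appears before the runtime can answer for that container ID, so the lookup 404s and the event is dropped; the ContainerStarted events that follow show the containers being picked up moments later. A tolerant handler looks roughly like this sketch (the error value and lookup function are invented for illustration):

package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("can't find the container with id")

// handleWatchEvent skips "not found" races instead of failing: a later
// housekeeping pass will observe the container once the runtime knows it.
func handleWatchEvent(cgroup string, lookup func(string) error) error {
	if err := lookup(cgroup); err != nil {
		if errors.Is(err, errNotFound) {
			fmt.Printf("Failed to process watch event %s: %v\n", cgroup, err)
			return nil // benign: drop the event
		}
		return err // anything else is a real error
	}
	return nil
}

func main() {
	lookup := func(string) error { return errNotFound }
	handleWatchEvent("/kubepods.slice/.../crio-ec627c9e00089aa4", lookup)
}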
logger="UnhandledError" Dec 01 09:18:02 crc kubenswrapper[4763]: E1201 09:18:02.550125 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-42cb5" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" Dec 01 09:18:02 crc kubenswrapper[4763]: E1201 09:18:02.576244 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 09:18:02 crc kubenswrapper[4763]: E1201 09:18:02.576399 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-95s82,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rlrvc_openshift-marketplace(7eee7d18-22c2-4cd4-aa75-01c94eb4423d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 01 09:18:02 crc kubenswrapper[4763]: E1201 09:18:02.577641 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rlrvc" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.460056 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b7b34902-05a3-46de-9dc9-4a55e71c6e2a","Type":"ContainerStarted","Data":"a0d2c75f865598a2f19e9e4a98ec01a75fdd9a26a9eebd7ffeb2d10c5c49617c"} Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.462336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"d5567adb-2074-432a-ba69-93b2ea007cf2","Type":"ContainerStarted","Data":"bad1bf8ac925ab845d3b23deb86b142187c527513b1c43ebdb52bc2001dfa513"} Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.464620 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" event={"ID":"db50acd1-5694-49bc-9027-e96f7612e795","Type":"ContainerStarted","Data":"ea72c0ed478e20ea26b1721ebb27571446710a4e9e3d82003f90cb7007a1144e"} Dec 01 09:18:03 crc kubenswrapper[4763]: E1201 09:18:03.466746 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rlrvc" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" Dec 01 09:18:03 crc kubenswrapper[4763]: E1201 09:18:03.466745 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-42cb5" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.489646 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=10.489626584 podStartE2EDuration="10.489626584s" podCreationTimestamp="2025-12-01 09:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:18:03.484089752 +0000 UTC m=+200.752738530" watchObservedRunningTime="2025-12-01 09:18:03.489626584 +0000 UTC m=+200.758275352" Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.548574 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=14.548555223 podStartE2EDuration="14.548555223s" podCreationTimestamp="2025-12-01 09:17:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:18:03.547988987 +0000 UTC m=+200.816637755" watchObservedRunningTime="2025-12-01 09:18:03.548555223 +0000 UTC m=+200.817203991" Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.929327 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.929386 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.929432 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.929912 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:18:03 crc kubenswrapper[4763]: I1201 09:18:03.930029 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af" gracePeriod=600 Dec 01 09:18:04 crc kubenswrapper[4763]: I1201 09:18:04.471871 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af" exitCode=0 Dec 01 09:18:04 crc kubenswrapper[4763]: I1201 09:18:04.471917 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af"} Dec 01 09:18:04 crc kubenswrapper[4763]: I1201 09:18:04.472376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"e04da687e5f490cd7c72b41a50dd42a262635c23ccef8396950a220f1c76c07c"} Dec 01 09:18:04 crc kubenswrapper[4763]: I1201 09:18:04.474141 4763 generic.go:334] "Generic (PLEG): container finished" podID="d5567adb-2074-432a-ba69-93b2ea007cf2" containerID="bad1bf8ac925ab845d3b23deb86b142187c527513b1c43ebdb52bc2001dfa513" exitCode=0 Dec 01 09:18:04 crc kubenswrapper[4763]: I1201 09:18:04.474239 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d5567adb-2074-432a-ba69-93b2ea007cf2","Type":"ContainerDied","Data":"bad1bf8ac925ab845d3b23deb86b142187c527513b1c43ebdb52bc2001dfa513"} Dec 01 09:18:04 crc kubenswrapper[4763]: I1201 09:18:04.476594 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rtkzb" event={"ID":"db50acd1-5694-49bc-9027-e96f7612e795","Type":"ContainerStarted","Data":"738a446e6aac8be1965e2c98231b4143ccbd0c340b295c7a3f34be3e8e0c4ae0"} Dec 01 09:18:04 crc kubenswrapper[4763]: I1201 09:18:04.530822 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rtkzb" podStartSLOduration=183.530801739 podStartE2EDuration="3m3.530801739s" podCreationTimestamp="2025-12-01 09:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:18:04.527106708 +0000 UTC m=+201.795755496" watchObservedRunningTime="2025-12-01 09:18:04.530801739 +0000 UTC m=+201.799450497" Dec 01 09:18:05 crc kubenswrapper[4763]: I1201 09:18:05.708304 4763 util.go:48] "No ready sandbox for pod can be found. 
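This is the full liveness-failure path in one place: the probe keeps failing (connection refused on 127.0.0.1:8798), the probe worker flips the container to unhealthy once the failure threshold is reached, and the kubelet kills the container with the pod's grace period (600s here) so it can be restarted; the ContainerDied/ContainerStarted pair that follows is the restart. A compressed sketch of the threshold logic, with an illustrative threshold value:

package main

import "fmt"

// probeWorker counts consecutive failures; only when failureThreshold is hit
// does the kubelet act, which is why the single failed probes earlier in the
// log did not restart anything.
type probeWorker struct {
	failures  int
	threshold int
}

func (p *probeWorker) observe(healthy bool) (kill bool) {
	if healthy {
		p.failures = 0
		return false
	}
	p.failures++
	return p.failures >= p.threshold
}

func main() {
	w := &probeWorker{threshold: 3} // illustrative; set per-probe in the pod spec
	for i := 0; i < 3; i++ {
		if w.observe(false) {
			fmt.Println("Killing container with a grace period gracePeriod=600")
		}
	}
}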
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:18:05 crc kubenswrapper[4763]: I1201 09:18:05.851298 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5567adb-2074-432a-ba69-93b2ea007cf2-kubelet-dir\") pod \"d5567adb-2074-432a-ba69-93b2ea007cf2\" (UID: \"d5567adb-2074-432a-ba69-93b2ea007cf2\") " Dec 01 09:18:05 crc kubenswrapper[4763]: I1201 09:18:05.851362 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5567adb-2074-432a-ba69-93b2ea007cf2-kube-api-access\") pod \"d5567adb-2074-432a-ba69-93b2ea007cf2\" (UID: \"d5567adb-2074-432a-ba69-93b2ea007cf2\") " Dec 01 09:18:05 crc kubenswrapper[4763]: I1201 09:18:05.852184 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5567adb-2074-432a-ba69-93b2ea007cf2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d5567adb-2074-432a-ba69-93b2ea007cf2" (UID: "d5567adb-2074-432a-ba69-93b2ea007cf2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:18:05 crc kubenswrapper[4763]: I1201 09:18:05.861677 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5567adb-2074-432a-ba69-93b2ea007cf2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d5567adb-2074-432a-ba69-93b2ea007cf2" (UID: "d5567adb-2074-432a-ba69-93b2ea007cf2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:18:05 crc kubenswrapper[4763]: I1201 09:18:05.952314 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5567adb-2074-432a-ba69-93b2ea007cf2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:05 crc kubenswrapper[4763]: I1201 09:18:05.952348 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d5567adb-2074-432a-ba69-93b2ea007cf2-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:06 crc kubenswrapper[4763]: I1201 09:18:06.685498 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d5567adb-2074-432a-ba69-93b2ea007cf2","Type":"ContainerDied","Data":"60a3b7c45281cc9c0f9964c9f33652864ef43aee4f4bfe2a5d28243643bc8002"} Dec 01 09:18:06 crc kubenswrapper[4763]: I1201 09:18:06.685789 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a3b7c45281cc9c0f9964c9f33652864ef43aee4f4bfe2a5d28243643bc8002" Dec 01 09:18:06 crc kubenswrapper[4763]: I1201 09:18:06.685850 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 01 09:18:14 crc kubenswrapper[4763]: I1201 09:18:14.735911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dzd9h" event={"ID":"4f902258-1cf0-4e18-a155-b43ca9cd2cc4","Type":"ContainerStarted","Data":"701c3534271a83a3ee1daf9d9a642c236f7b1511e5defe374fd6253b37095327"} Dec 01 09:18:14 crc kubenswrapper[4763]: I1201 09:18:14.738731 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sctxq" event={"ID":"580e94d9-c525-4a0a-b965-6aefa59b2b64","Type":"ContainerStarted","Data":"556a1f6a5340318120f9d6ff9961232939919a893f866493f6fed58fec580159"} Dec 01 09:18:14 crc kubenswrapper[4763]: E1201 09:18:14.928368 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f902258_1cf0_4e18_a155_b43ca9cd2cc4.slice/crio-conmon-701c3534271a83a3ee1daf9d9a642c236f7b1511e5defe374fd6253b37095327.scope\": RecentStats: unable to find data in memory cache]" Dec 01 09:18:15 crc kubenswrapper[4763]: I1201 09:18:15.750645 4763 generic.go:334] "Generic (PLEG): container finished" podID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerID="556a1f6a5340318120f9d6ff9961232939919a893f866493f6fed58fec580159" exitCode=0 Dec 01 09:18:15 crc kubenswrapper[4763]: I1201 09:18:15.750942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sctxq" event={"ID":"580e94d9-c525-4a0a-b965-6aefa59b2b64","Type":"ContainerDied","Data":"556a1f6a5340318120f9d6ff9961232939919a893f866493f6fed58fec580159"} Dec 01 09:18:15 crc kubenswrapper[4763]: I1201 09:18:15.759905 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jkphf" event={"ID":"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18","Type":"ContainerStarted","Data":"99f9f4a3eec839e7a8aff5e0804736bb1e267b05f0c56b06ea347223091bfcd5"} Dec 01 09:18:15 crc kubenswrapper[4763]: I1201 09:18:15.772712 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerID="701c3534271a83a3ee1daf9d9a642c236f7b1511e5defe374fd6253b37095327" exitCode=0 Dec 01 09:18:15 crc kubenswrapper[4763]: I1201 09:18:15.772755 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dzd9h" event={"ID":"4f902258-1cf0-4e18-a155-b43ca9cd2cc4","Type":"ContainerDied","Data":"701c3534271a83a3ee1daf9d9a642c236f7b1511e5defe374fd6253b37095327"} Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.780924 4763 generic.go:334] "Generic (PLEG): container finished" podID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerID="3f55fa3fbc1a779a52a9e154a64bdc11e9664ae582ac1b74421067326ecf91e6" exitCode=0 Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.781004 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vdnfj" event={"ID":"0edc4cd3-ec16-4757-93d5-be9a6272a0a5","Type":"ContainerDied","Data":"3f55fa3fbc1a779a52a9e154a64bdc11e9664ae582ac1b74421067326ecf91e6"} Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.784480 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sctxq" event={"ID":"580e94d9-c525-4a0a-b965-6aefa59b2b64","Type":"ContainerStarted","Data":"d8b6065f324c9484b221ae53039a3d2bff0ef86354e28875996f9bb494de3e25"} Dec 01 09:18:16 crc 
kubenswrapper[4763]: I1201 09:18:16.786735 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerID="99f9f4a3eec839e7a8aff5e0804736bb1e267b05f0c56b06ea347223091bfcd5" exitCode=0 Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.786792 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jkphf" event={"ID":"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18","Type":"ContainerDied","Data":"99f9f4a3eec839e7a8aff5e0804736bb1e267b05f0c56b06ea347223091bfcd5"} Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.791447 4763 generic.go:334] "Generic (PLEG): container finished" podID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerID="e848f56211d097f21bd6009b8ed5e5381442ae88783cbc2fc750ee168ed89154" exitCode=0 Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.791524 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn4xw" event={"ID":"c9759bf4-0f7f-459d-b393-59c047d7a4d9","Type":"ContainerDied","Data":"e848f56211d097f21bd6009b8ed5e5381442ae88783cbc2fc750ee168ed89154"} Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.796139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dzd9h" event={"ID":"4f902258-1cf0-4e18-a155-b43ca9cd2cc4","Type":"ContainerStarted","Data":"7caf8e623f354dce7f5a5659e93a88a89a66f6651e21e6299043e9ef64e12bd1"} Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.850791 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sctxq" podStartSLOduration=3.489784265 podStartE2EDuration="1m7.850762124s" podCreationTimestamp="2025-12-01 09:17:09 +0000 UTC" firstStartedPulling="2025-12-01 09:17:11.827351675 +0000 UTC m=+149.096000443" lastFinishedPulling="2025-12-01 09:18:16.188329534 +0000 UTC m=+213.456978302" observedRunningTime="2025-12-01 09:18:16.827214137 +0000 UTC m=+214.095862915" watchObservedRunningTime="2025-12-01 09:18:16.850762124 +0000 UTC m=+214.119410892" Dec 01 09:18:16 crc kubenswrapper[4763]: I1201 09:18:16.873151 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dzd9h" podStartSLOduration=2.470097718 podStartE2EDuration="1m5.873128169s" podCreationTimestamp="2025-12-01 09:17:11 +0000 UTC" firstStartedPulling="2025-12-01 09:17:12.897958939 +0000 UTC m=+150.166607707" lastFinishedPulling="2025-12-01 09:18:16.30098939 +0000 UTC m=+213.569638158" observedRunningTime="2025-12-01 09:18:16.851904075 +0000 UTC m=+214.120552833" watchObservedRunningTime="2025-12-01 09:18:16.873128169 +0000 UTC m=+214.141776927" Dec 01 09:18:17 crc kubenswrapper[4763]: I1201 09:18:17.802085 4763 generic.go:334] "Generic (PLEG): container finished" podID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerID="4c6ec7d1a606989ef39cb9bd7299209dbb76b639a21a976e69439e72560dbc7a" exitCode=0 Dec 01 09:18:17 crc kubenswrapper[4763]: I1201 09:18:17.803194 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-df54r" event={"ID":"caa7c3a1-91cb-460a-a74e-3027d72cdfcb","Type":"ContainerDied","Data":"4c6ec7d1a606989ef39cb9bd7299209dbb76b639a21a976e69439e72560dbc7a"} Dec 01 09:18:17 crc kubenswrapper[4763]: I1201 09:18:17.810806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlrvc" 
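The startup-duration entries here make the SLO arithmetic sketched earlier concrete: for certified-operators-sctxq the end-to-end duration is 1m7.850762124s, the pull window runs from 09:17:11.827351675 to 09:18:16.188329534 (64.360977859s), and subtracting one from the other gives exactly the reported podStartSLOduration of 3.489784265s.

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the certified-operators-sctxq entry above.
	e2e := 67850762124 * time.Nanosecond  // podStartE2EDuration = 1m7.850762124s
	pull := 64360977859 * time.Nanosecond // lastFinishedPulling - firstStartedPulling
	fmt.Println(e2e - pull)               // 3.489784265s = podStartSLOduration
}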
event={"ID":"7eee7d18-22c2-4cd4-aa75-01c94eb4423d","Type":"ContainerStarted","Data":"e54000f058492a7ab164f7b12aec785034f0b5cbbd4376a3ab22184c26d4f81b"} Dec 01 09:18:18 crc kubenswrapper[4763]: I1201 09:18:18.823913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jkphf" event={"ID":"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18","Type":"ContainerStarted","Data":"4c071ed3eacb58f842705d61e219c04dbf1b652174579c046f9064a3a4431fb0"} Dec 01 09:18:18 crc kubenswrapper[4763]: I1201 09:18:18.828841 4763 generic.go:334] "Generic (PLEG): container finished" podID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerID="e54000f058492a7ab164f7b12aec785034f0b5cbbd4376a3ab22184c26d4f81b" exitCode=0 Dec 01 09:18:18 crc kubenswrapper[4763]: I1201 09:18:18.828911 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlrvc" event={"ID":"7eee7d18-22c2-4cd4-aa75-01c94eb4423d","Type":"ContainerDied","Data":"e54000f058492a7ab164f7b12aec785034f0b5cbbd4376a3ab22184c26d4f81b"} Dec 01 09:18:18 crc kubenswrapper[4763]: I1201 09:18:18.831075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vdnfj" event={"ID":"0edc4cd3-ec16-4757-93d5-be9a6272a0a5","Type":"ContainerStarted","Data":"78daf206850f15304c9211606841596b10daf19c7dd4a1c285dedc83f5fc0011"} Dec 01 09:18:18 crc kubenswrapper[4763]: I1201 09:18:18.869017 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jkphf" podStartSLOduration=4.106371742 podStartE2EDuration="1m9.868998122s" podCreationTimestamp="2025-12-01 09:17:09 +0000 UTC" firstStartedPulling="2025-12-01 09:17:11.859869136 +0000 UTC m=+149.128517904" lastFinishedPulling="2025-12-01 09:18:17.622495506 +0000 UTC m=+214.891144284" observedRunningTime="2025-12-01 09:18:18.860999273 +0000 UTC m=+216.129648041" watchObservedRunningTime="2025-12-01 09:18:18.868998122 +0000 UTC m=+216.137646890" Dec 01 09:18:18 crc kubenswrapper[4763]: I1201 09:18:18.894682 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vdnfj" podStartSLOduration=4.188113817 podStartE2EDuration="1m9.894659977s" podCreationTimestamp="2025-12-01 09:17:09 +0000 UTC" firstStartedPulling="2025-12-01 09:17:11.801706789 +0000 UTC m=+149.070355557" lastFinishedPulling="2025-12-01 09:18:17.508252939 +0000 UTC m=+214.776901717" observedRunningTime="2025-12-01 09:18:18.88603742 +0000 UTC m=+216.154686198" watchObservedRunningTime="2025-12-01 09:18:18.894659977 +0000 UTC m=+216.163308745" Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.640851 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.640895 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.780944 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.793429 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.793515 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.837287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42cb5" event={"ID":"91386cf5-c3df-4e87-be1a-14989dee67f9","Type":"ContainerStarted","Data":"2cb5b794fbde1d1d273fddc3373c92b701a17602c9af0bbf620e81a1f4a47aae"} Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.838724 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlrvc" event={"ID":"7eee7d18-22c2-4cd4-aa75-01c94eb4423d","Type":"ContainerStarted","Data":"b947537df180a1b959450716ac808061c09c9c77fa9ce7772369d247aea973f0"} Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.840218 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-df54r" event={"ID":"caa7c3a1-91cb-460a-a74e-3027d72cdfcb","Type":"ContainerStarted","Data":"f55f0c680669d3e57ed0a0f6054014275a588ed82a0a72c670635e9efc3cf728"} Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.842014 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn4xw" event={"ID":"c9759bf4-0f7f-459d-b393-59c047d7a4d9","Type":"ContainerStarted","Data":"890c107224c885e587847e41465a98d8583c867c0fe852547f925cf9a0d03f6c"} Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.911934 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cn4xw" podStartSLOduration=4.135987327 podStartE2EDuration="1m6.911917346s" podCreationTimestamp="2025-12-01 09:17:13 +0000 UTC" firstStartedPulling="2025-12-01 09:17:16.022664879 +0000 UTC m=+153.291313647" lastFinishedPulling="2025-12-01 09:18:18.798594898 +0000 UTC m=+216.067243666" observedRunningTime="2025-12-01 09:18:19.909314794 +0000 UTC m=+217.177963562" watchObservedRunningTime="2025-12-01 09:18:19.911917346 +0000 UTC m=+217.180566114" Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.913093 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-df54r" podStartSLOduration=2.730976994 podStartE2EDuration="1m8.913088548s" podCreationTimestamp="2025-12-01 09:17:11 +0000 UTC" firstStartedPulling="2025-12-01 09:17:12.877728101 +0000 UTC m=+150.146376869" lastFinishedPulling="2025-12-01 09:18:19.059839655 +0000 UTC m=+216.328488423" observedRunningTime="2025-12-01 09:18:19.89060367 +0000 UTC m=+217.159252438" watchObservedRunningTime="2025-12-01 09:18:19.913088548 +0000 UTC m=+217.181737316" Dec 01 09:18:19 crc kubenswrapper[4763]: I1201 09:18:19.929465 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rlrvc" podStartSLOduration=3.365553016 podStartE2EDuration="1m10.929434047s" podCreationTimestamp="2025-12-01 09:17:09 +0000 UTC" firstStartedPulling="2025-12-01 09:17:11.871153952 +0000 UTC m=+149.139802720" lastFinishedPulling="2025-12-01 09:18:19.435034983 +0000 UTC m=+216.703683751" observedRunningTime="2025-12-01 09:18:19.928267584 +0000 UTC m=+217.196916352" watchObservedRunningTime="2025-12-01 09:18:19.929434047 +0000 UTC m=+217.198082815" Dec 01 09:18:20 crc kubenswrapper[4763]: I1201 09:18:20.001198 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:18:20 crc kubenswrapper[4763]: I1201 09:18:20.001251 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:18:20 crc kubenswrapper[4763]: I1201 09:18:20.057262 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:18:20 crc kubenswrapper[4763]: I1201 09:18:20.249386 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:18:20 crc kubenswrapper[4763]: I1201 09:18:20.249769 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:18:20 crc kubenswrapper[4763]: I1201 09:18:20.842444 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jkphf" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="registry-server" probeResult="failure" output=< Dec 01 09:18:20 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 09:18:20 crc kubenswrapper[4763]: > Dec 01 09:18:21 crc kubenswrapper[4763]: I1201 09:18:21.287325 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rlrvc" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="registry-server" probeResult="failure" output=< Dec 01 09:18:21 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 09:18:21 crc kubenswrapper[4763]: > Dec 01 09:18:21 crc kubenswrapper[4763]: I1201 09:18:21.579091 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:18:21 crc kubenswrapper[4763]: I1201 09:18:21.579418 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:18:21 crc kubenswrapper[4763]: I1201 09:18:21.618705 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:18:21 crc kubenswrapper[4763]: I1201 09:18:21.951018 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:18:21 crc kubenswrapper[4763]: I1201 09:18:21.951113 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:18:21 crc kubenswrapper[4763]: I1201 09:18:21.997508 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:18:22 crc kubenswrapper[4763]: I1201 09:18:22.860749 4763 generic.go:334] "Generic (PLEG): container finished" podID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerID="2cb5b794fbde1d1d273fddc3373c92b701a17602c9af0bbf620e81a1f4a47aae" exitCode=0 Dec 01 09:18:22 crc kubenswrapper[4763]: I1201 09:18:22.860861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42cb5" event={"ID":"91386cf5-c3df-4e87-be1a-14989dee67f9","Type":"ContainerDied","Data":"2cb5b794fbde1d1d273fddc3373c92b701a17602c9af0bbf620e81a1f4a47aae"} Dec 01 09:18:22 crc kubenswrapper[4763]: I1201 09:18:22.918607 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:18:23 crc kubenswrapper[4763]: I1201 09:18:23.308748 4763 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" podUID="08a32dd3-b775-4153-a505-99b17e1637b1" containerName="oauth-openshift" containerID="cri-o://5457caff952e4a23d58b648667e2f9f4917f7e55ee076208a69a333323e430f2" gracePeriod=15 Dec 01 09:18:23 crc kubenswrapper[4763]: I1201 09:18:23.384640 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:18:23 crc kubenswrapper[4763]: I1201 09:18:23.384699 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:18:24 crc kubenswrapper[4763]: I1201 09:18:24.027340 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dzd9h"] Dec 01 09:18:24 crc kubenswrapper[4763]: I1201 09:18:24.426530 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cn4xw" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="registry-server" probeResult="failure" output=< Dec 01 09:18:24 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 09:18:24 crc kubenswrapper[4763]: > Dec 01 09:18:24 crc kubenswrapper[4763]: I1201 09:18:24.872713 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dzd9h" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerName="registry-server" containerID="cri-o://7caf8e623f354dce7f5a5659e93a88a89a66f6651e21e6299043e9ef64e12bd1" gracePeriod=2 Dec 01 09:18:26 crc kubenswrapper[4763]: I1201 09:18:26.885772 4763 generic.go:334] "Generic (PLEG): container finished" podID="08a32dd3-b775-4153-a505-99b17e1637b1" containerID="5457caff952e4a23d58b648667e2f9f4917f7e55ee076208a69a333323e430f2" exitCode=0 Dec 01 09:18:26 crc kubenswrapper[4763]: I1201 09:18:26.885828 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" event={"ID":"08a32dd3-b775-4153-a505-99b17e1637b1","Type":"ContainerDied","Data":"5457caff952e4a23d58b648667e2f9f4917f7e55ee076208a69a333323e430f2"} Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.896137 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" event={"ID":"08a32dd3-b775-4153-a505-99b17e1637b1","Type":"ContainerDied","Data":"40b3107e6a9004e097cf8bcb0fae0068f2df5a4132af76ab6e26f761b919ae07"} Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.896179 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40b3107e6a9004e097cf8bcb0fae0068f2df5a4132af76ab6e26f761b919ae07" Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.898900 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerID="7caf8e623f354dce7f5a5659e93a88a89a66f6651e21e6299043e9ef64e12bd1" exitCode=0 Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.899117 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dzd9h" event={"ID":"4f902258-1cf0-4e18-a155-b43ca9cd2cc4","Type":"ContainerDied","Data":"7caf8e623f354dce7f5a5659e93a88a89a66f6651e21e6299043e9ef64e12bd1"} Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.913044 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.943674 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-745f6bf96d-w7rjw"] Dec 01 09:18:27 crc kubenswrapper[4763]: E1201 09:18:27.943872 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5567adb-2074-432a-ba69-93b2ea007cf2" containerName="pruner" Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.943909 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5567adb-2074-432a-ba69-93b2ea007cf2" containerName="pruner" Dec 01 09:18:27 crc kubenswrapper[4763]: E1201 09:18:27.943929 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a32dd3-b775-4153-a505-99b17e1637b1" containerName="oauth-openshift" Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.943936 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a32dd3-b775-4153-a505-99b17e1637b1" containerName="oauth-openshift" Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.944030 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5567adb-2074-432a-ba69-93b2ea007cf2" containerName="pruner" Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.944040 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a32dd3-b775-4153-a505-99b17e1637b1" containerName="oauth-openshift" Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.944394 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:27 crc kubenswrapper[4763]: I1201 09:18:27.955859 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-745f6bf96d-w7rjw"] Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.083300 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-session\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.083616 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-service-ca\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.083767 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-provider-selection\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.083858 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-serving-cert\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-audit-policies\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084181 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-cliconfig\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084275 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-idp-0-file-data\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084357 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-login\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084446 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-router-certs\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084571 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/08a32dd3-b775-4153-a505-99b17e1637b1-audit-dir\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084665 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-error\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084759 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-ocp-branding-template\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.084909 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-trusted-ca-bundle\") pod \"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085027 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg695\" (UniqueName: \"kubernetes.io/projected/08a32dd3-b775-4153-a505-99b17e1637b1-kube-api-access-cg695\") pod 
\"08a32dd3-b775-4153-a505-99b17e1637b1\" (UID: \"08a32dd3-b775-4153-a505-99b17e1637b1\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085272 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-audit-policies\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085390 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085514 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085651 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085784 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085911 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086053 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcgdw\" (UniqueName: \"kubernetes.io/projected/2a447d0a-ec22-4296-acbd-d352d118fc0b-kube-api-access-vcgdw\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086150 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-session\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-error\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086435 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086573 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-login\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086694 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a447d0a-ec22-4296-acbd-d352d118fc0b-audit-dir\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086805 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085038 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08a32dd3-b775-4153-a505-99b17e1637b1-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.085747 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.086341 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.090889 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.091145 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.091554 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.091609 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.091831 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.093062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.094917 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a32dd3-b775-4153-a505-99b17e1637b1-kube-api-access-cg695" (OuterVolumeSpecName: "kube-api-access-cg695") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "kube-api-access-cg695". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.099826 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.101238 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.101583 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "08a32dd3-b775-4153-a505-99b17e1637b1" (UID: "08a32dd3-b775-4153-a505-99b17e1637b1"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188280 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188344 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcgdw\" (UniqueName: \"kubernetes.io/projected/2a447d0a-ec22-4296-acbd-d352d118fc0b-kube-api-access-vcgdw\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188364 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-session\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188403 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-error\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188422 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188447 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-login\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a447d0a-ec22-4296-acbd-d352d118fc0b-audit-dir\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 
09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-audit-policies\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188570 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188592 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188617 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188638 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-service-ca\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188681 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188691 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg695\" (UniqueName: \"kubernetes.io/projected/08a32dd3-b775-4153-a505-99b17e1637b1-kube-api-access-cg695\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188701 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 01 
09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188710 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188720 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188732 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188742 4763 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188752 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188762 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188771 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188780 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188788 4763 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/08a32dd3-b775-4153-a505-99b17e1637b1-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188797 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.188807 4763 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/08a32dd3-b775-4153-a505-99b17e1637b1-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.189549 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.189742 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.189927 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a447d0a-ec22-4296-acbd-d352d118fc0b-audit-dir\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.190702 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.191624 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a447d0a-ec22-4296-acbd-d352d118fc0b-audit-policies\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.191999 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-login\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.192694 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-error\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.193852 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-router-certs\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.193907 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " 
pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.194128 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-session\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.194281 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.195151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.196320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a447d0a-ec22-4296-acbd-d352d118fc0b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.206664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcgdw\" (UniqueName: \"kubernetes.io/projected/2a447d0a-ec22-4296-acbd-d352d118fc0b-kube-api-access-vcgdw\") pod \"oauth-openshift-745f6bf96d-w7rjw\" (UID: \"2a447d0a-ec22-4296-acbd-d352d118fc0b\") " pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.279436 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.686905 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.701358 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-745f6bf96d-w7rjw"] Dec 01 09:18:28 crc kubenswrapper[4763]: W1201 09:18:28.713170 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a447d0a_ec22_4296_acbd_d352d118fc0b.slice/crio-c18782593afc542a744205505dab97920fc1a12c1f5156e689cbc375bd093244 WatchSource:0}: Error finding container c18782593afc542a744205505dab97920fc1a12c1f5156e689cbc375bd093244: Status 404 returned error can't find the container with id c18782593afc542a744205505dab97920fc1a12c1f5156e689cbc375bd093244 Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.795597 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdvcf\" (UniqueName: \"kubernetes.io/projected/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-kube-api-access-wdvcf\") pod \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.796104 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-utilities\") pod \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.796332 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-catalog-content\") pod \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\" (UID: \"4f902258-1cf0-4e18-a155-b43ca9cd2cc4\") " Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.797549 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-utilities" (OuterVolumeSpecName: "utilities") pod "4f902258-1cf0-4e18-a155-b43ca9cd2cc4" (UID: "4f902258-1cf0-4e18-a155-b43ca9cd2cc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.798993 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-kube-api-access-wdvcf" (OuterVolumeSpecName: "kube-api-access-wdvcf") pod "4f902258-1cf0-4e18-a155-b43ca9cd2cc4" (UID: "4f902258-1cf0-4e18-a155-b43ca9cd2cc4"). InnerVolumeSpecName "kube-api-access-wdvcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.813162 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f902258-1cf0-4e18-a155-b43ca9cd2cc4" (UID: "4f902258-1cf0-4e18-a155-b43ca9cd2cc4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.897743 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdvcf\" (UniqueName: \"kubernetes.io/projected/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-kube-api-access-wdvcf\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.897774 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.897784 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f902258-1cf0-4e18-a155-b43ca9cd2cc4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.905921 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dzd9h" event={"ID":"4f902258-1cf0-4e18-a155-b43ca9cd2cc4","Type":"ContainerDied","Data":"4ff1709728fc6fc383f2a497234e20857eb0dca1c5c40e814893543aea860875"} Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.905980 4763 scope.go:117] "RemoveContainer" containerID="7caf8e623f354dce7f5a5659e93a88a89a66f6651e21e6299043e9ef64e12bd1" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.905982 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dzd9h" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.907630 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mxfdf" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.907610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" event={"ID":"2a447d0a-ec22-4296-acbd-d352d118fc0b","Type":"ContainerStarted","Data":"c18782593afc542a744205505dab97920fc1a12c1f5156e689cbc375bd093244"} Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.951328 4763 scope.go:117] "RemoveContainer" containerID="701c3534271a83a3ee1daf9d9a642c236f7b1511e5defe374fd6253b37095327" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.957331 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dzd9h"] Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.971891 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dzd9h"] Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.974447 4763 scope.go:117] "RemoveContainer" containerID="e4672137a7205aebca6cce3ba91a6f003116d5f435c386d3402a888fb500cdad" Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.976049 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxfdf"] Dec 01 09:18:28 crc kubenswrapper[4763]: I1201 09:18:28.978819 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mxfdf"] Dec 01 09:18:29 crc kubenswrapper[4763]: I1201 09:18:29.000946 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a32dd3-b775-4153-a505-99b17e1637b1" path="/var/lib/kubelet/pods/08a32dd3-b775-4153-a505-99b17e1637b1/volumes" Dec 01 09:18:29 crc kubenswrapper[4763]: I1201 09:18:29.002517 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" path="/var/lib/kubelet/pods/4f902258-1cf0-4e18-a155-b43ca9cd2cc4/volumes" Dec 01 09:18:29 crc kubenswrapper[4763]: I1201 09:18:29.695189 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sctxq" Dec 01 09:18:29 crc kubenswrapper[4763]: I1201 09:18:29.830666 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:18:29 crc kubenswrapper[4763]: I1201 09:18:29.876351 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jkphf" Dec 01 09:18:30 crc kubenswrapper[4763]: I1201 09:18:30.035314 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:18:30 crc kubenswrapper[4763]: I1201 09:18:30.295294 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:18:30 crc kubenswrapper[4763]: I1201 09:18:30.347207 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:18:30 crc kubenswrapper[4763]: I1201 09:18:30.960209 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" event={"ID":"2a447d0a-ec22-4296-acbd-d352d118fc0b","Type":"ContainerStarted","Data":"8a5c2be77c3b6e3a4adfec139d1660843490aed97f6e2e1145daa0ec1b5f4a94"} Dec 01 09:18:30 crc kubenswrapper[4763]: I1201 09:18:30.960785 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:30 crc kubenswrapper[4763]: I1201 09:18:30.968845 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" Dec 01 09:18:30 crc kubenswrapper[4763]: I1201 09:18:30.988476 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-745f6bf96d-w7rjw" podStartSLOduration=32.988438838 podStartE2EDuration="32.988438838s" podCreationTimestamp="2025-12-01 09:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:18:30.981245311 +0000 UTC m=+228.249894099" watchObservedRunningTime="2025-12-01 09:18:30.988438838 +0000 UTC m=+228.257087606" Dec 01 09:18:31 crc kubenswrapper[4763]: I1201 09:18:31.625202 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-df54r" Dec 01 09:18:31 crc kubenswrapper[4763]: I1201 09:18:31.971009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42cb5" event={"ID":"91386cf5-c3df-4e87-be1a-14989dee67f9","Type":"ContainerStarted","Data":"c1e9b443c8411aeea449c0e7041284cdf641d0c9c900e481e7bb8d17245de75e"} Dec 01 09:18:31 crc kubenswrapper[4763]: I1201 09:18:31.991037 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-42cb5" podStartSLOduration=4.32945186 podStartE2EDuration="1m19.991019443s" podCreationTimestamp="2025-12-01 09:17:12 +0000 UTC" firstStartedPulling="2025-12-01 09:17:15.001319131 +0000 UTC m=+152.269967899" lastFinishedPulling="2025-12-01 09:18:30.662886714 +0000 UTC 
m=+227.931535482" observedRunningTime="2025-12-01 09:18:31.988068072 +0000 UTC m=+229.256716860" watchObservedRunningTime="2025-12-01 09:18:31.991019443 +0000 UTC m=+229.259668211" Dec 01 09:18:32 crc kubenswrapper[4763]: I1201 09:18:32.026845 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vdnfj"] Dec 01 09:18:32 crc kubenswrapper[4763]: I1201 09:18:32.027078 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vdnfj" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerName="registry-server" containerID="cri-o://78daf206850f15304c9211606841596b10daf19c7dd4a1c285dedc83f5fc0011" gracePeriod=2 Dec 01 09:18:32 crc kubenswrapper[4763]: I1201 09:18:32.965942 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:18:32 crc kubenswrapper[4763]: I1201 09:18:32.966195 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:18:32 crc kubenswrapper[4763]: I1201 09:18:32.979853 4763 generic.go:334] "Generic (PLEG): container finished" podID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerID="78daf206850f15304c9211606841596b10daf19c7dd4a1c285dedc83f5fc0011" exitCode=0 Dec 01 09:18:32 crc kubenswrapper[4763]: I1201 09:18:32.979897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vdnfj" event={"ID":"0edc4cd3-ec16-4757-93d5-be9a6272a0a5","Type":"ContainerDied","Data":"78daf206850f15304c9211606841596b10daf19c7dd4a1c285dedc83f5fc0011"} Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.243948 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.303619 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-catalog-content\") pod \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.304024 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-utilities\") pod \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.304285 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stbpg\" (UniqueName: \"kubernetes.io/projected/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-kube-api-access-stbpg\") pod \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\" (UID: \"0edc4cd3-ec16-4757-93d5-be9a6272a0a5\") " Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.304768 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-utilities" (OuterVolumeSpecName: "utilities") pod "0edc4cd3-ec16-4757-93d5-be9a6272a0a5" (UID: "0edc4cd3-ec16-4757-93d5-be9a6272a0a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.311673 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-kube-api-access-stbpg" (OuterVolumeSpecName: "kube-api-access-stbpg") pod "0edc4cd3-ec16-4757-93d5-be9a6272a0a5" (UID: "0edc4cd3-ec16-4757-93d5-be9a6272a0a5"). InnerVolumeSpecName "kube-api-access-stbpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.349306 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0edc4cd3-ec16-4757-93d5-be9a6272a0a5" (UID: "0edc4cd3-ec16-4757-93d5-be9a6272a0a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.405156 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stbpg\" (UniqueName: \"kubernetes.io/projected/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-kube-api-access-stbpg\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.405206 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.405222 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0edc4cd3-ec16-4757-93d5-be9a6272a0a5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.426837 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rlrvc"] Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.427356 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rlrvc" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="registry-server" containerID="cri-o://b947537df180a1b959450716ac808061c09c9c77fa9ce7772369d247aea973f0" gracePeriod=2 Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.434265 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.489284 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.987185 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vdnfj" event={"ID":"0edc4cd3-ec16-4757-93d5-be9a6272a0a5","Type":"ContainerDied","Data":"34b60406dce0d884f5eca665e8c87c79d92a63967771819c4d2c553bd5be377a"} Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.987267 4763 scope.go:117] "RemoveContainer" containerID="78daf206850f15304c9211606841596b10daf19c7dd4a1c285dedc83f5fc0011" Dec 01 09:18:33 crc kubenswrapper[4763]: I1201 09:18:33.987397 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vdnfj" Dec 01 09:18:34 crc kubenswrapper[4763]: I1201 09:18:34.011048 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-42cb5" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="registry-server" probeResult="failure" output=< Dec 01 09:18:34 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 09:18:34 crc kubenswrapper[4763]: > Dec 01 09:18:34 crc kubenswrapper[4763]: I1201 09:18:34.018658 4763 scope.go:117] "RemoveContainer" containerID="3f55fa3fbc1a779a52a9e154a64bdc11e9664ae582ac1b74421067326ecf91e6" Dec 01 09:18:34 crc kubenswrapper[4763]: I1201 09:18:34.025091 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vdnfj"] Dec 01 09:18:34 crc kubenswrapper[4763]: I1201 09:18:34.028951 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vdnfj"] Dec 01 09:18:34 crc kubenswrapper[4763]: I1201 09:18:34.036359 4763 scope.go:117] "RemoveContainer" containerID="c31156d62c816f8c0b506995c444689844f1e16177d14945f50dd9447c0f4222" Dec 01 09:18:34 crc kubenswrapper[4763]: I1201 09:18:34.995045 4763 generic.go:334] "Generic (PLEG): container finished" podID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerID="b947537df180a1b959450716ac808061c09c9c77fa9ce7772369d247aea973f0" exitCode=0 Dec 01 09:18:34 crc kubenswrapper[4763]: I1201 09:18:34.998609 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" path="/var/lib/kubelet/pods/0edc4cd3-ec16-4757-93d5-be9a6272a0a5/volumes" Dec 01 09:18:34 crc kubenswrapper[4763]: I1201 09:18:34.999118 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlrvc" event={"ID":"7eee7d18-22c2-4cd4-aa75-01c94eb4423d","Type":"ContainerDied","Data":"b947537df180a1b959450716ac808061c09c9c77fa9ce7772369d247aea973f0"} Dec 01 09:18:35 crc kubenswrapper[4763]: I1201 09:18:35.827182 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn4xw"] Dec 01 09:18:35 crc kubenswrapper[4763]: I1201 09:18:35.827735 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cn4xw" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="registry-server" containerID="cri-o://890c107224c885e587847e41465a98d8583c867c0fe852547f925cf9a0d03f6c" gracePeriod=2 Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.230816 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.341158 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-catalog-content\") pod \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.341422 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-utilities\") pod \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.341502 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95s82\" (UniqueName: \"kubernetes.io/projected/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-kube-api-access-95s82\") pod \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\" (UID: \"7eee7d18-22c2-4cd4-aa75-01c94eb4423d\") " Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.342090 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-utilities" (OuterVolumeSpecName: "utilities") pod "7eee7d18-22c2-4cd4-aa75-01c94eb4423d" (UID: "7eee7d18-22c2-4cd4-aa75-01c94eb4423d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.349647 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-kube-api-access-95s82" (OuterVolumeSpecName: "kube-api-access-95s82") pod "7eee7d18-22c2-4cd4-aa75-01c94eb4423d" (UID: "7eee7d18-22c2-4cd4-aa75-01c94eb4423d"). InnerVolumeSpecName "kube-api-access-95s82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.385006 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7eee7d18-22c2-4cd4-aa75-01c94eb4423d" (UID: "7eee7d18-22c2-4cd4-aa75-01c94eb4423d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.442914 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95s82\" (UniqueName: \"kubernetes.io/projected/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-kube-api-access-95s82\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.442949 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:36 crc kubenswrapper[4763]: I1201 09:18:36.442965 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d18-22c2-4cd4-aa75-01c94eb4423d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.007270 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rlrvc" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.007446 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlrvc" event={"ID":"7eee7d18-22c2-4cd4-aa75-01c94eb4423d","Type":"ContainerDied","Data":"c2c13d88ab75eee6010b8630df6bb1cc2c0c3b6aa3889364b06ffc9564a85b82"} Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.007846 4763 scope.go:117] "RemoveContainer" containerID="b947537df180a1b959450716ac808061c09c9c77fa9ce7772369d247aea973f0" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.013693 4763 generic.go:334] "Generic (PLEG): container finished" podID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerID="890c107224c885e587847e41465a98d8583c867c0fe852547f925cf9a0d03f6c" exitCode=0 Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.013736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn4xw" event={"ID":"c9759bf4-0f7f-459d-b393-59c047d7a4d9","Type":"ContainerDied","Data":"890c107224c885e587847e41465a98d8583c867c0fe852547f925cf9a0d03f6c"} Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.029651 4763 scope.go:117] "RemoveContainer" containerID="e54000f058492a7ab164f7b12aec785034f0b5cbbd4376a3ab22184c26d4f81b" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.035017 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rlrvc"] Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.039050 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rlrvc"] Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.050188 4763 scope.go:117] "RemoveContainer" containerID="e0a496b7c78a602338c28e06d47413194b967b3a36c79e195c797d101b868018" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.621553 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.759327 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-utilities\") pod \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.759422 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-catalog-content\") pod \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.759484 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bfqv\" (UniqueName: \"kubernetes.io/projected/c9759bf4-0f7f-459d-b393-59c047d7a4d9-kube-api-access-8bfqv\") pod \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\" (UID: \"c9759bf4-0f7f-459d-b393-59c047d7a4d9\") " Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.768015 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9759bf4-0f7f-459d-b393-59c047d7a4d9-kube-api-access-8bfqv" (OuterVolumeSpecName: "kube-api-access-8bfqv") pod "c9759bf4-0f7f-459d-b393-59c047d7a4d9" (UID: "c9759bf4-0f7f-459d-b393-59c047d7a4d9"). InnerVolumeSpecName "kube-api-access-8bfqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.768470 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-utilities" (OuterVolumeSpecName: "utilities") pod "c9759bf4-0f7f-459d-b393-59c047d7a4d9" (UID: "c9759bf4-0f7f-459d-b393-59c047d7a4d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.860687 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.860735 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bfqv\" (UniqueName: \"kubernetes.io/projected/c9759bf4-0f7f-459d-b393-59c047d7a4d9-kube-api-access-8bfqv\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.873933 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9759bf4-0f7f-459d-b393-59c047d7a4d9" (UID: "c9759bf4-0f7f-459d-b393-59c047d7a4d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:18:37 crc kubenswrapper[4763]: I1201 09:18:37.962017 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9759bf4-0f7f-459d-b393-59c047d7a4d9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:38 crc kubenswrapper[4763]: I1201 09:18:38.023707 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cn4xw" event={"ID":"c9759bf4-0f7f-459d-b393-59c047d7a4d9","Type":"ContainerDied","Data":"aeafac82980205e80991ff9f6ea94ca357adb80254a03d4c1598bf6b8b5ed8d5"} Dec 01 09:18:38 crc kubenswrapper[4763]: I1201 09:18:38.023749 4763 scope.go:117] "RemoveContainer" containerID="890c107224c885e587847e41465a98d8583c867c0fe852547f925cf9a0d03f6c" Dec 01 09:18:38 crc kubenswrapper[4763]: I1201 09:18:38.023840 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cn4xw" Dec 01 09:18:38 crc kubenswrapper[4763]: I1201 09:18:38.039287 4763 scope.go:117] "RemoveContainer" containerID="e848f56211d097f21bd6009b8ed5e5381442ae88783cbc2fc750ee168ed89154" Dec 01 09:18:38 crc kubenswrapper[4763]: I1201 09:18:38.048598 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cn4xw"] Dec 01 09:18:38 crc kubenswrapper[4763]: I1201 09:18:38.051894 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cn4xw"] Dec 01 09:18:38 crc kubenswrapper[4763]: I1201 09:18:38.064551 4763 scope.go:117] "RemoveContainer" containerID="157acdd9ff1f37b40b792602dab2e2ada33d4a30bb7acd3c7460a659f21e7c5b" Dec 01 09:18:39 crc kubenswrapper[4763]: I1201 09:18:39.000224 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" path="/var/lib/kubelet/pods/7eee7d18-22c2-4cd4-aa75-01c94eb4423d/volumes" Dec 01 09:18:39 crc kubenswrapper[4763]: I1201 09:18:39.001042 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" path="/var/lib/kubelet/pods/c9759bf4-0f7f-459d-b393-59c047d7a4d9/volumes" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.691847 4763 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.692383 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644" gracePeriod=15 Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.692532 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b" gracePeriod=15 Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.692574 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138" gracePeriod=15 Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.692602 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b" gracePeriod=15 Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.692635 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422" gracePeriod=15 Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.696978 4763 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697328 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697352 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697376 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697386 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697400 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697753 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697783 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697837 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697854 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="extract-content"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697862 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="extract-content"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697873 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerName="extract-content"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697881 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerName="extract-content"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697891 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697898 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697907 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="extract-utilities"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697915 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="extract-utilities"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697929 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="extract-content"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697938 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="extract-content"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697948 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697956 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697966 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697974 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.697985 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.697992 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.698000 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698007 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.698018 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerName="extract-utilities"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698025 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerName="extract-utilities"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.698037 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698045 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.698056 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698075 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.698087 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="extract-utilities"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698095 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="extract-utilities"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.698103 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerName="extract-content"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698220 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerName="extract-content"
Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.698232 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerName="extract-utilities"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698242 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerName="extract-utilities"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698608 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eee7d18-22c2-4cd4-aa75-01c94eb4423d" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698624 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698632 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0edc4cd3-ec16-4757-93d5-be9a6272a0a5" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698639 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698646 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f902258-1cf0-4e18-a155-b43ca9cd2cc4" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698660 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9759bf4-0f7f-459d-b393-59c047d7a4d9" containerName="registry-server"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698667 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698675 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698681 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.698864 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.703087 4763 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.703876 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.709797 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 01 09:18:40 crc kubenswrapper[4763]: E1201 09:18:40.735612 4763 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.795197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.795276 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.795304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.795327 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.795348 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.795369 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.795409 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.795429 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896464 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896549 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896619 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896622 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896682 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896687 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896708 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896773 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896816 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896825 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:40 crc kubenswrapper[4763]: I1201 09:18:40.896861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.036508 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.040417 4763 generic.go:334] "Generic (PLEG): container finished" podID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" containerID="a0d2c75f865598a2f19e9e4a98ec01a75fdd9a26a9eebd7ffeb2d10c5c49617c" exitCode=0 Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.040500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b7b34902-05a3-46de-9dc9-4a55e71c6e2a","Type":"ContainerDied","Data":"a0d2c75f865598a2f19e9e4a98ec01a75fdd9a26a9eebd7ffeb2d10c5c49617c"} Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.041137 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.042271 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.043544 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.044169 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b" exitCode=0 Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.044193 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138" exitCode=0 Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.044204 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b" exitCode=0 Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.044213 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422" exitCode=2 Dec 01 09:18:41 crc kubenswrapper[4763]: I1201 09:18:41.044261 4763 scope.go:117] "RemoveContainer" containerID="b9a3261904331e6523e6ce4cad27f64b7e52bd42c47a4ec755b0f4d8c597c5a4" Dec 01 09:18:41 crc kubenswrapper[4763]: W1201 09:18:41.063856 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-55ca2239b53762fd86e9d4c64654f1c35af99676f9d12dcec2539dd89a234f70 WatchSource:0}: Error finding container 55ca2239b53762fd86e9d4c64654f1c35af99676f9d12dcec2539dd89a234f70: Status 404 returned error can't find the container with id 55ca2239b53762fd86e9d4c64654f1c35af99676f9d12dcec2539dd89a234f70 Dec 01 09:18:41 crc kubenswrapper[4763]: E1201 09:18:41.069224 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.199:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d0cd110ccd854 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:18:41.06866082 +0000 UTC m=+238.337309588,LastTimestamp:2025-12-01 09:18:41.06866082 +0000 UTC m=+238.337309588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.052273 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.054701 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654"} Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.054759 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"55ca2239b53762fd86e9d4c64654f1c35af99676f9d12dcec2539dd89a234f70"} Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.055375 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:42 crc kubenswrapper[4763]: E1201 09:18:42.055785 4763 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.285535 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.286218 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.415397 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-var-lock\") pod \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.415473 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kubelet-dir\") pod \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.415561 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kube-api-access\") pod \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\" (UID: \"b7b34902-05a3-46de-9dc9-4a55e71c6e2a\") " Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.415564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b7b34902-05a3-46de-9dc9-4a55e71c6e2a" (UID: "b7b34902-05a3-46de-9dc9-4a55e71c6e2a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.415618 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-var-lock" (OuterVolumeSpecName: "var-lock") pod "b7b34902-05a3-46de-9dc9-4a55e71c6e2a" (UID: "b7b34902-05a3-46de-9dc9-4a55e71c6e2a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.415800 4763 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.415812 4763 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.423892 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b7b34902-05a3-46de-9dc9-4a55e71c6e2a" (UID: "b7b34902-05a3-46de-9dc9-4a55e71c6e2a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.517414 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7b34902-05a3-46de-9dc9-4a55e71c6e2a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:42 crc kubenswrapper[4763]: I1201 09:18:42.996976 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.029489 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.030066 4763 status_manager.go:851] "Failed to get status for pod" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" pod="openshift-marketplace/redhat-operators-42cb5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-42cb5\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.031931 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.068990 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.070207 4763 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644" exitCode=0 Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.070300 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f10b628075dc39a8861b445a66444b4311d829a3df2804cc0814de86112a8a" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.072587 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b7b34902-05a3-46de-9dc9-4a55e71c6e2a","Type":"ContainerDied","Data":"49568e126e3d44223151c4f503e7b5887f48c2c78afd78e4e4e988caca4e014a"} Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.072620 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49568e126e3d44223151c4f503e7b5887f48c2c78afd78e4e4e988caca4e014a" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.072946 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.073570 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.075506 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.076214 4763 status_manager.go:851] "Failed to get status for pod" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" pod="openshift-marketplace/redhat-operators-42cb5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-42cb5\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.080010 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.082035 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.083162 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.083765 4763 status_manager.go:851] "Failed to get status for pod" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" pod="openshift-marketplace/redhat-operators-42cb5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-42cb5\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.084239 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.088917 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-42cb5" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.090209 4763 status_manager.go:851] "Failed to get status for pod" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" pod="openshift-marketplace/redhat-operators-42cb5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-42cb5\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.090483 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.090867 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.225952 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.225995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.226059 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.226111 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.226140 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.226211 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.226618 4763 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.226635 4763 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:43 crc kubenswrapper[4763]: I1201 09:18:43.226645 4763 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:18:44 crc kubenswrapper[4763]: I1201 09:18:44.077981 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:44 crc kubenswrapper[4763]: I1201 09:18:44.096486 4763 status_manager.go:851] "Failed to get status for pod" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" pod="openshift-marketplace/redhat-operators-42cb5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-42cb5\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:44 crc kubenswrapper[4763]: I1201 09:18:44.096796 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:44 crc kubenswrapper[4763]: I1201 09:18:44.097128 4763 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:45 crc kubenswrapper[4763]: I1201 09:18:45.000559 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 01 09:18:47 crc kubenswrapper[4763]: E1201 09:18:47.588395 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:47 crc kubenswrapper[4763]: E1201 09:18:47.589134 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:47 crc kubenswrapper[4763]: E1201 09:18:47.589393 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:47 crc kubenswrapper[4763]: E1201 09:18:47.589658 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:47 crc kubenswrapper[4763]: E1201 09:18:47.589900 4763 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:47 crc kubenswrapper[4763]: I1201 09:18:47.589933 4763 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 01 09:18:47 crc kubenswrapper[4763]: E1201 09:18:47.590133 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="200ms" Dec 01 09:18:47 crc kubenswrapper[4763]: E1201 09:18:47.790885 4763 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="400ms" Dec 01 09:18:48 crc kubenswrapper[4763]: E1201 09:18:48.033760 4763 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" volumeName="registry-storage" Dec 01 09:18:48 crc kubenswrapper[4763]: E1201 09:18:48.191678 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="800ms" Dec 01 09:18:48 crc kubenswrapper[4763]: E1201 09:18:48.993015 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="1.6s" Dec 01 09:18:49 crc kubenswrapper[4763]: E1201 09:18:49.831589 4763 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.199:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d0cd110ccd854 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-01 09:18:41.06866082 +0000 UTC m=+238.337309588,LastTimestamp:2025-12-01 09:18:41.06866082 +0000 UTC m=+238.337309588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 01 09:18:50 crc kubenswrapper[4763]: E1201 09:18:50.594398 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="3.2s" Dec 01 09:18:51 crc kubenswrapper[4763]: I1201 09:18:51.994008 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:51 crc kubenswrapper[4763]: I1201 09:18:51.995162 4763 status_manager.go:851] "Failed to get status for pod" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" pod="openshift-marketplace/redhat-operators-42cb5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-42cb5\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:51 crc kubenswrapper[4763]: I1201 09:18:51.995703 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:52 crc kubenswrapper[4763]: I1201 09:18:52.009141 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:18:52 crc kubenswrapper[4763]: I1201 09:18:52.009175 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:18:52 crc kubenswrapper[4763]: E1201 09:18:52.009577 4763 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:52 crc kubenswrapper[4763]: I1201 09:18:52.010042 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:52 crc kubenswrapper[4763]: I1201 09:18:52.143384 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a0d9324b98798d4642f428ded158c61cc6ea67fb8157320d27663a9e1aed00d3"} Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.000696 4763 status_manager.go:851] "Failed to get status for pod" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" pod="openshift-marketplace/redhat-operators-42cb5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-42cb5\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.001142 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.001551 4763 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.149104 4763 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2f92bac9cbaf0ab05bd30205566eeff654f5f3de2629832a40af725ead86c4c3" exitCode=0 Dec 01 09:18:53 crc 
kubenswrapper[4763]: I1201 09:18:53.149144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2f92bac9cbaf0ab05bd30205566eeff654f5f3de2629832a40af725ead86c4c3"} Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.149365 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.149377 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:18:53 crc kubenswrapper[4763]: E1201 09:18:53.149755 4763 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.150055 4763 status_manager.go:851] "Failed to get status for pod" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.150356 4763 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:53 crc kubenswrapper[4763]: I1201 09:18:53.151513 4763 status_manager.go:851] "Failed to get status for pod" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" pod="openshift-marketplace/redhat-operators-42cb5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-42cb5\": dial tcp 38.102.83.199:6443: connect: connection refused" Dec 01 09:18:53 crc kubenswrapper[4763]: E1201 09:18:53.796603 4763 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.199:6443: connect: connection refused" interval="6.4s" Dec 01 09:18:55 crc kubenswrapper[4763]: I1201 09:18:55.162678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"830b026b817d977d6bc4819111cf9fdd85d1db65594706c95577895724ed35b4"} Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.172581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cd7c45b8b5e0b19d68c03e081f56482aa556f19c78b319b97283f0609f2f023e"} Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.172913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"224321fc6c37e90db060a4b42531921fa5496d9a3bd293f6849f02852858a562"} Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 
09:18:56.172935 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.172945 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3a707ad6f7e5a2889335a14ecfc8982cc00bd7c945d9cd78fe69732f698072a3"} Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.172953 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f0f86c1e879f43f69a69e71f0bdb4bde57da68a0852fcbb11ac462d7365dfd05"} Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.172819 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.172971 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.175700 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.175747 4763 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3" exitCode=1 Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.175769 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3"} Dec 01 09:18:56 crc kubenswrapper[4763]: I1201 09:18:56.176165 4763 scope.go:117] "RemoveContainer" containerID="93982ebe23ecb4654d8e5df6db5c80fe22efdf7529d7154b872fd5396b8210f3" Dec 01 09:18:57 crc kubenswrapper[4763]: I1201 09:18:57.011171 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:57 crc kubenswrapper[4763]: I1201 09:18:57.011490 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:18:57 crc kubenswrapper[4763]: I1201 09:18:57.017748 4763 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 01 09:18:57 crc kubenswrapper[4763]: I1201 09:18:57.017805 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 01 09:18:57 crc kubenswrapper[4763]: I1201 09:18:57.182975 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 01 09:18:57 crc kubenswrapper[4763]: I1201 09:18:57.183036 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ddc35ec9014e3d0ddb45d50951b25f58ae46c5fe45c878dd1b23f641ab4ab61"} Dec 01 09:18:59 crc kubenswrapper[4763]: I1201 09:18:59.499564 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:19:01 crc kubenswrapper[4763]: I1201 09:19:01.986168 4763 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:19:02 crc kubenswrapper[4763]: I1201 09:19:02.015560 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:19:02 crc kubenswrapper[4763]: I1201 09:19:02.022564 4763 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dc649cf7-3035-426d-bb44-b537c60d44bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:18:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:18:53Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-01T09:18:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://830b026b817d977d6bc4819111cf9fdd85d1db65594706c95577895724ed35b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a707ad6f7e5a2889335a14ecfc8982cc00bd7c945d9cd78fe69732f698072a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0f86c1e879f43f69a69e71f0bdb4bde57da68a0852fcbb11ac462d7365dfd05\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd7c45b8b5e0b19d68c03e081f56482aa556f19c78b319b97283f0609f2f023e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:18:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://224321fc6
c37e90db060a4b42531921fa5496d9a3bd293f6849f02852858a562\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-01T09:18:55Z\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f92bac9cbaf0ab05bd30205566eeff654f5f3de2629832a40af725ead86c4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f92bac9cbaf0ab05bd30205566eeff654f5f3de2629832a40af725ead86c4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-01T09:18:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-01T09:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}]}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod \"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"dc649cf7-3035-426d-bb44-b537c60d44bf\": field is immutable" Dec 01 09:19:02 crc kubenswrapper[4763]: I1201 09:19:02.067866 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d997a1fa-94ef-47bc-b85a-b5946006a758" Dec 01 09:19:02 crc kubenswrapper[4763]: I1201 09:19:02.208276 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:19:02 crc kubenswrapper[4763]: I1201 09:19:02.208307 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:19:02 crc kubenswrapper[4763]: I1201 09:19:02.211192 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d997a1fa-94ef-47bc-b85a-b5946006a758" Dec 01 09:19:02 crc kubenswrapper[4763]: I1201 09:19:02.214879 4763 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://830b026b817d977d6bc4819111cf9fdd85d1db65594706c95577895724ed35b4" Dec 01 09:19:02 crc kubenswrapper[4763]: I1201 09:19:02.214913 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 01 09:19:03 crc kubenswrapper[4763]: I1201 09:19:03.216881 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:19:03 crc kubenswrapper[4763]: I1201 09:19:03.217217 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:19:03 crc kubenswrapper[4763]: I1201 09:19:03.221639 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d997a1fa-94ef-47bc-b85a-b5946006a758" Dec 01 09:19:04 crc kubenswrapper[4763]: I1201 09:19:04.220787 4763 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:19:04 crc kubenswrapper[4763]: I1201 09:19:04.221069 4763 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="dc649cf7-3035-426d-bb44-b537c60d44bf" Dec 01 09:19:04 crc kubenswrapper[4763]: I1201 09:19:04.224162 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d997a1fa-94ef-47bc-b85a-b5946006a758" Dec 01 09:19:05 crc kubenswrapper[4763]: I1201 09:19:05.699800 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:19:05 crc kubenswrapper[4763]: I1201 09:19:05.703793 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:19:06 crc kubenswrapper[4763]: I1201 09:19:06.236714 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 01 09:19:11 crc kubenswrapper[4763]: I1201 09:19:11.907213 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 01 09:19:12 crc kubenswrapper[4763]: I1201 09:19:12.752261 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 01 09:19:12 crc kubenswrapper[4763]: I1201 09:19:12.836675 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 01 09:19:13 crc kubenswrapper[4763]: I1201 09:19:13.594530 4763 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 01 09:19:13 crc kubenswrapper[4763]: I1201 09:19:13.785618 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 09:19:13 crc kubenswrapper[4763]: I1201 09:19:13.879580 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 01 09:19:13 crc kubenswrapper[4763]: I1201 09:19:13.964211 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 01 09:19:13 crc kubenswrapper[4763]: I1201 09:19:13.986233 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.038512 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.183936 4763 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.243514 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.296298 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.390536 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.445967 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.564229 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.591578 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.681595 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.727070 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.783417 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.890714 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 01 09:19:14 crc kubenswrapper[4763]: I1201 09:19:14.979607 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.009946 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.056970 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.147324 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.148864 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.227775 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.404118 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.647548 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.653221 4763 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.675733 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.681766 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.697013 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.780390 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.787803 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.889048 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.924719 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 01 09:19:15 crc kubenswrapper[4763]: I1201 09:19:15.936966 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.119771 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.124781 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.178898 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.286032 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.315162 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.484731 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.497937 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.502177 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.554047 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.572954 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.629954 4763 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 01 09:19:16 crc kubenswrapper[4763]: I1201 09:19:16.682849 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.073660 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.140828 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.165437 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.198141 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.206862 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.276805 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.346008 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.384446 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.393979 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.418798 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.677058 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.703525 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.790709 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 01 09:19:17 crc kubenswrapper[4763]: I1201 09:19:17.877684 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.020336 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.039187 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.068158 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.079083 
4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.151220 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.231356 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.237586 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.266124 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.306351 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.326878 4763 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.388855 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.514983 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.581638 4763 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.636042 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.641023 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.647687 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.690255 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.704741 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.706964 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.793321 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.807840 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.828310 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 01 09:19:18 crc kubenswrapper[4763]: I1201 09:19:18.829259 
4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.033908 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.070738 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.093053 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.125040 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.144860 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.154166 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.183821 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.214198 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.235533 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.259489 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.307493 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.373937 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.435753 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.496480 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.509553 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.539695 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.551028 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.743824 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" 
Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.799222 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.856562 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.866770 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.879263 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 01 09:19:19 crc kubenswrapper[4763]: I1201 09:19:19.904658 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.075033 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.106614 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.175289 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.320249 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.354164 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.361226 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.382159 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.386054 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.541187 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.554428 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.623976 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.639658 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.777089 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.797162 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.849366 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.866237 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 01 09:19:20 crc kubenswrapper[4763]: I1201 09:19:20.920150 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.052247 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.064675 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.114912 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.201433 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.248699 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.302569 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.336675 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.371416 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.389500 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.472043 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.490629 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.529091 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.589343 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.621027 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.684113 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.704336 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.719090 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.764650 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.793006 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.913434 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.939255 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.960635 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.979433 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 01 09:19:21 crc kubenswrapper[4763]: I1201 09:19:21.992224 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.025220 4763 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.053885 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.090629 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.096287 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.098006 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.107639 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.170020 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.194073 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.203959 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.257412 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.294319 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.334371 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.411123 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.461191 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.481610 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.505418 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.616268 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.629642 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.726216 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.823831 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.913372 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 01 09:19:22 crc kubenswrapper[4763]: I1201 09:19:22.956419 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.029055 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.095380 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.230554 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.342546 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.344053 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.432165 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.433548 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.471760 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.509697 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.539552 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.556443 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.587356 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.700036 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.755765 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.939928 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.944295 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 01 09:19:23 crc kubenswrapper[4763]: I1201 09:19:23.978670 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.049183 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.065082 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.066403 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.146390 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.285793 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.298602 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.299136 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.373756 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.396810 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.494876 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.528017 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.590352 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.713144 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.719723 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.833685 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 01 09:19:24 crc kubenswrapper[4763]: I1201 09:19:24.860386 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.069862 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.090253 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.095631 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.198583 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.220939 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.278112 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.471352 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.472520 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.486268 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.644076 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.768527 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.772659 4763 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.800908 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.811822 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 01 09:19:25 crc kubenswrapper[4763]: I1201 09:19:25.916887 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.052703 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.073446 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.076312 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.094690 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.100141 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.109920 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.140024 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.149737 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.231966 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.352300 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.398324 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.518208 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.599438 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.647147 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.712597 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 01 09:19:26 crc kubenswrapper[4763]: I1201 09:19:26.754185 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:26.807471 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:26.807917 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:26.845312 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.054693 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.096627 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.130262 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.235919 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.348398 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.581913 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.639958 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.671391 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 01 09:19:27 crc kubenswrapper[4763]: I1201 09:19:27.725261 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.101920 4763 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.105876 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.105920 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7mwbs","openshift-kube-apiserver/kube-apiserver-crc"]
Dec 01 09:19:28 crc kubenswrapper[4763]: E1201 09:19:28.106101 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" containerName="installer"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.106118 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" containerName="installer"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.106207 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b34902-05a3-46de-9dc9-4a55e71c6e2a" containerName="installer"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.106501 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4pz4m","openshift-marketplace/certified-operators-sctxq","openshift-marketplace/redhat-marketplace-df54r","openshift-marketplace/redhat-operators-42cb5","openshift-marketplace/community-operators-jkphf"]
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.106729 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.106847 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jkphf" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="registry-server" containerID="cri-o://4c071ed3eacb58f842705d61e219c04dbf1b652174579c046f9064a3a4431fb0" gracePeriod=30
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.107023 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-df54r" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerName="registry-server" containerID="cri-o://f55f0c680669d3e57ed0a0f6054014275a588ed82a0a72c670635e9efc3cf728" gracePeriod=30
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.107183 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-42cb5" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="registry-server" containerID="cri-o://c1e9b443c8411aeea449c0e7041284cdf641d0c9c900e481e7bb8d17245de75e" gracePeriod=30
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.107238 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" podUID="440365f2-877d-49bd-89c3-0dc4ad54efaa" containerName="marketplace-operator" containerID="cri-o://1b38d30c5e9e4a640aad56b10d796b6c8a57d19e73f5fb5d15e2762f971d93e4" gracePeriod=30
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.107016 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sctxq" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerName="registry-server" containerID="cri-o://d8b6065f324c9484b221ae53039a3d2bff0ef86354e28875996f9bb494de3e25" gracePeriod=30
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.113485 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.168913 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=27.168892622 podStartE2EDuration="27.168892622s" podCreationTimestamp="2025-12-01 09:19:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:19:28.136496261 +0000 UTC m=+285.405145029" watchObservedRunningTime="2025-12-01 09:19:28.168892622 +0000 UTC m=+285.437541400"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.192073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.192153 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxs62\" (UniqueName: \"kubernetes.io/projected/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-kube-api-access-lxs62\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.192189 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.293193 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.293274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxs62\" (UniqueName: \"kubernetes.io/projected/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-kube-api-access-lxs62\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.293307 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.294318 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.302309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.318055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxs62\" (UniqueName: \"kubernetes.io/projected/8ed98359-8184-409c-9f5d-f2b2b21b9cb7-kube-api-access-lxs62\") pod \"marketplace-operator-79b997595-7mwbs\" (UID: \"8ed98359-8184-409c-9f5d-f2b2b21b9cb7\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.345934 4763 generic.go:334] "Generic (PLEG): container finished" podID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerID="4c071ed3eacb58f842705d61e219c04dbf1b652174579c046f9064a3a4431fb0" exitCode=0
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.346011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jkphf" event={"ID":"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18","Type":"ContainerDied","Data":"4c071ed3eacb58f842705d61e219c04dbf1b652174579c046f9064a3a4431fb0"}
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.354231 4763 generic.go:334] "Generic (PLEG): container finished" podID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerID="f55f0c680669d3e57ed0a0f6054014275a588ed82a0a72c670635e9efc3cf728" exitCode=0
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.354313 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-df54r" event={"ID":"caa7c3a1-91cb-460a-a74e-3027d72cdfcb","Type":"ContainerDied","Data":"f55f0c680669d3e57ed0a0f6054014275a588ed82a0a72c670635e9efc3cf728"}
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.356675 4763 generic.go:334] "Generic (PLEG): container finished" podID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerID="d8b6065f324c9484b221ae53039a3d2bff0ef86354e28875996f9bb494de3e25" exitCode=0
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.356744 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sctxq" event={"ID":"580e94d9-c525-4a0a-b965-6aefa59b2b64","Type":"ContainerDied","Data":"d8b6065f324c9484b221ae53039a3d2bff0ef86354e28875996f9bb494de3e25"}
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.359725 4763 generic.go:334] "Generic (PLEG): container finished" podID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerID="c1e9b443c8411aeea449c0e7041284cdf641d0c9c900e481e7bb8d17245de75e" exitCode=0
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.359810 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42cb5" event={"ID":"91386cf5-c3df-4e87-be1a-14989dee67f9","Type":"ContainerDied","Data":"c1e9b443c8411aeea449c0e7041284cdf641d0c9c900e481e7bb8d17245de75e"}
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.361226 4763 generic.go:334] "Generic (PLEG): container finished" podID="440365f2-877d-49bd-89c3-0dc4ad54efaa" containerID="1b38d30c5e9e4a640aad56b10d796b6c8a57d19e73f5fb5d15e2762f971d93e4" exitCode=0
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.362136 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" event={"ID":"440365f2-877d-49bd-89c3-0dc4ad54efaa","Type":"ContainerDied","Data":"1b38d30c5e9e4a640aad56b10d796b6c8a57d19e73f5fb5d15e2762f971d93e4"}
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.410286 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.431774 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.536660 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-42cb5"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.600140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wr7t\" (UniqueName: \"kubernetes.io/projected/91386cf5-c3df-4e87-be1a-14989dee67f9-kube-api-access-9wr7t\") pod \"91386cf5-c3df-4e87-be1a-14989dee67f9\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.600208 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-catalog-content\") pod \"91386cf5-c3df-4e87-be1a-14989dee67f9\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.600337 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-utilities\") pod \"91386cf5-c3df-4e87-be1a-14989dee67f9\" (UID: \"91386cf5-c3df-4e87-be1a-14989dee67f9\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.603488 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-utilities" (OuterVolumeSpecName: "utilities") pod "91386cf5-c3df-4e87-be1a-14989dee67f9" (UID: "91386cf5-c3df-4e87-be1a-14989dee67f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.609528 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91386cf5-c3df-4e87-be1a-14989dee67f9-kube-api-access-9wr7t" (OuterVolumeSpecName: "kube-api-access-9wr7t") pod "91386cf5-c3df-4e87-be1a-14989dee67f9" (UID: "91386cf5-c3df-4e87-be1a-14989dee67f9"). InnerVolumeSpecName "kube-api-access-9wr7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.625767 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.640153 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-df54r"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.644872 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jkphf"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.668633 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sctxq"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.701955 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-catalog-content\") pod \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.702025 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj5g5\" (UniqueName: \"kubernetes.io/projected/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-kube-api-access-pj5g5\") pod \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.702076 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-catalog-content\") pod \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.702118 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-operator-metrics\") pod \"440365f2-877d-49bd-89c3-0dc4ad54efaa\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.702150 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-trusted-ca\") pod \"440365f2-877d-49bd-89c3-0dc4ad54efaa\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.702177 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp7cg\" (UniqueName: \"kubernetes.io/projected/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-kube-api-access-qp7cg\") pod \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.702204 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mnpv\" (UniqueName: \"kubernetes.io/projected/440365f2-877d-49bd-89c3-0dc4ad54efaa-kube-api-access-2mnpv\") pod \"440365f2-877d-49bd-89c3-0dc4ad54efaa\" (UID: \"440365f2-877d-49bd-89c3-0dc4ad54efaa\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.702243 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-utilities\") pod \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\" (UID: \"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.702298 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-utilities\") pod \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\" (UID: \"caa7c3a1-91cb-460a-a74e-3027d72cdfcb\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.704288 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "440365f2-877d-49bd-89c3-0dc4ad54efaa" (UID: "440365f2-877d-49bd-89c3-0dc4ad54efaa"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.704714 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.704785 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.704800 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wr7t\" (UniqueName: \"kubernetes.io/projected/91386cf5-c3df-4e87-be1a-14989dee67f9-kube-api-access-9wr7t\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.709159 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-utilities" (OuterVolumeSpecName: "utilities") pod "caa7c3a1-91cb-460a-a74e-3027d72cdfcb" (UID: "caa7c3a1-91cb-460a-a74e-3027d72cdfcb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.709976 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440365f2-877d-49bd-89c3-0dc4ad54efaa-kube-api-access-2mnpv" (OuterVolumeSpecName: "kube-api-access-2mnpv") pod "440365f2-877d-49bd-89c3-0dc4ad54efaa" (UID: "440365f2-877d-49bd-89c3-0dc4ad54efaa"). InnerVolumeSpecName "kube-api-access-2mnpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.710439 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-utilities" (OuterVolumeSpecName: "utilities") pod "0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" (UID: "0ce2b6fa-b131-466e-9ee9-4c4672c9fa18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.727640 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.727729 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "caa7c3a1-91cb-460a-a74e-3027d72cdfcb" (UID: "caa7c3a1-91cb-460a-a74e-3027d72cdfcb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.731602 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-kube-api-access-pj5g5" (OuterVolumeSpecName: "kube-api-access-pj5g5") pod "0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" (UID: "0ce2b6fa-b131-466e-9ee9-4c4672c9fa18"). InnerVolumeSpecName "kube-api-access-pj5g5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.732238 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-kube-api-access-qp7cg" (OuterVolumeSpecName: "kube-api-access-qp7cg") pod "caa7c3a1-91cb-460a-a74e-3027d72cdfcb" (UID: "caa7c3a1-91cb-460a-a74e-3027d72cdfcb"). InnerVolumeSpecName "kube-api-access-qp7cg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.735096 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "440365f2-877d-49bd-89c3-0dc4ad54efaa" (UID: "440365f2-877d-49bd-89c3-0dc4ad54efaa"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.761045 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91386cf5-c3df-4e87-be1a-14989dee67f9" (UID: "91386cf5-c3df-4e87-be1a-14989dee67f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.764845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" (UID: "0ce2b6fa-b131-466e-9ee9-4c4672c9fa18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806183 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-utilities\") pod \"580e94d9-c525-4a0a-b965-6aefa59b2b64\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806265 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-catalog-content\") pod \"580e94d9-c525-4a0a-b965-6aefa59b2b64\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806351 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fws5\" (UniqueName: \"kubernetes.io/projected/580e94d9-c525-4a0a-b965-6aefa59b2b64-kube-api-access-2fws5\") pod \"580e94d9-c525-4a0a-b965-6aefa59b2b64\" (UID: \"580e94d9-c525-4a0a-b965-6aefa59b2b64\") "
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806568 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806582 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91386cf5-c3df-4e87-be1a-14989dee67f9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806593 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806601 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj5g5\" (UniqueName: \"kubernetes.io/projected/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-kube-api-access-pj5g5\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806611 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806620 4763 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/440365f2-877d-49bd-89c3-0dc4ad54efaa-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806637 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp7cg\" (UniqueName: \"kubernetes.io/projected/caa7c3a1-91cb-460a-a74e-3027d72cdfcb-kube-api-access-qp7cg\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806648 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mnpv\" (UniqueName: \"kubernetes.io/projected/440365f2-877d-49bd-89c3-0dc4ad54efaa-kube-api-access-2mnpv\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806660 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.806896 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-utilities" (OuterVolumeSpecName: "utilities") pod "580e94d9-c525-4a0a-b965-6aefa59b2b64" (UID: "580e94d9-c525-4a0a-b965-6aefa59b2b64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.808804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580e94d9-c525-4a0a-b965-6aefa59b2b64-kube-api-access-2fws5" (OuterVolumeSpecName: "kube-api-access-2fws5") pod "580e94d9-c525-4a0a-b965-6aefa59b2b64" (UID: "580e94d9-c525-4a0a-b965-6aefa59b2b64"). InnerVolumeSpecName "kube-api-access-2fws5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.849977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "580e94d9-c525-4a0a-b965-6aefa59b2b64" (UID: "580e94d9-c525-4a0a-b965-6aefa59b2b64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.907792 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fws5\" (UniqueName: \"kubernetes.io/projected/580e94d9-c525-4a0a-b965-6aefa59b2b64-kube-api-access-2fws5\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.907827 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.907857 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/580e94d9-c525-4a0a-b965-6aefa59b2b64-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:28 crc kubenswrapper[4763]: I1201 09:19:28.964012 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7mwbs"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.386300 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sctxq"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.386291 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sctxq" event={"ID":"580e94d9-c525-4a0a-b965-6aefa59b2b64","Type":"ContainerDied","Data":"8c4bbd0f5e63521c35475cebbc8fafee22623f088663ec4f635c0ab5e890d7b0"}
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.386777 4763 scope.go:117] "RemoveContainer" containerID="d8b6065f324c9484b221ae53039a3d2bff0ef86354e28875996f9bb494de3e25"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.387845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs" event={"ID":"8ed98359-8184-409c-9f5d-f2b2b21b9cb7","Type":"ContainerStarted","Data":"8e4734f71db8d2b9f91751dca01ad2cdf68b5c056a9b57a13038a519a8af4606"}
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.387936 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs" event={"ID":"8ed98359-8184-409c-9f5d-f2b2b21b9cb7","Type":"ContainerStarted","Data":"dbf6642c77761d1187aec2e4c1053d13f4ca91b0d228c780e379a1e679c3379f"}
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.388294 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.391589 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-42cb5" event={"ID":"91386cf5-c3df-4e87-be1a-14989dee67f9","Type":"ContainerDied","Data":"cf228e5378761ac938f9c99a1614061a0904b12765565319608630af82ce1337"}
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.391699 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-42cb5"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.394293 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m" event={"ID":"440365f2-877d-49bd-89c3-0dc4ad54efaa","Type":"ContainerDied","Data":"06611798f10fac70e29aac80c1cf7acb414141fb41c1a1cfa208b1e3ab1df859"}
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.394378 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4pz4m"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.397030 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.403132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jkphf" event={"ID":"0ce2b6fa-b131-466e-9ee9-4c4672c9fa18","Type":"ContainerDied","Data":"d2fa62713016b86344bcd21ce87a1d4ea25bfd9c823cfad01532ff6d51e503ee"}
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.403261 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jkphf"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.412777 4763 scope.go:117] "RemoveContainer" containerID="556a1f6a5340318120f9d6ff9961232939919a893f866493f6fed58fec580159"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.414283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-df54r" event={"ID":"caa7c3a1-91cb-460a-a74e-3027d72cdfcb","Type":"ContainerDied","Data":"81df243cd5abcf7c4deac3e39858b2bcb957d8c8a6f5f709eb22e4c6f0c5370a"}
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.414394 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-df54r"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.414955 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7mwbs" podStartSLOduration=10.414942348 podStartE2EDuration="10.414942348s" podCreationTimestamp="2025-12-01 09:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:19:29.413068516 +0000 UTC m=+286.681717284" watchObservedRunningTime="2025-12-01 09:19:29.414942348 +0000 UTC m=+286.683591116"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.428269 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.435528 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sctxq"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.435706 4763 scope.go:117] "RemoveContainer" containerID="13dc130819411941112637d9944fb60f0554d01dec17ef65603bb396c2ee8309"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.441153 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sctxq"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.455270 4763 scope.go:117] "RemoveContainer" containerID="c1e9b443c8411aeea449c0e7041284cdf641d0c9c900e481e7bb8d17245de75e"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.455774 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4pz4m"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.462994 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4pz4m"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.481167 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jkphf"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.481492 4763 scope.go:117] "RemoveContainer" containerID="2cb5b794fbde1d1d273fddc3373c92b701a17602c9af0bbf620e81a1f4a47aae"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.484064 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jkphf"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.496770 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-df54r"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.505964 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-df54r"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.509958 4763 scope.go:117] "RemoveContainer" containerID="de9eee2c1fdb0d933e79765e710f57688f7bb089c76485523e74a2c506ede97a"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.517481 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-42cb5"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.520266 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-42cb5"]
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.527342 4763 scope.go:117] "RemoveContainer" containerID="1b38d30c5e9e4a640aad56b10d796b6c8a57d19e73f5fb5d15e2762f971d93e4"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.546219 4763 scope.go:117] "RemoveContainer" containerID="4c071ed3eacb58f842705d61e219c04dbf1b652174579c046f9064a3a4431fb0"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.560677 4763 scope.go:117] "RemoveContainer" containerID="99f9f4a3eec839e7a8aff5e0804736bb1e267b05f0c56b06ea347223091bfcd5"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.575492 4763 scope.go:117] "RemoveContainer" containerID="d018413111e40f7d6a9d99c251e7fec5fb8bf03a13daace19d25102e04076773"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.590670 4763 scope.go:117] "RemoveContainer" containerID="f55f0c680669d3e57ed0a0f6054014275a588ed82a0a72c670635e9efc3cf728"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.602376 4763 scope.go:117] "RemoveContainer" containerID="4c6ec7d1a606989ef39cb9bd7299209dbb76b639a21a976e69439e72560dbc7a"
Dec 01 09:19:29 crc kubenswrapper[4763]: I1201 09:19:29.616205 4763 scope.go:117] "RemoveContainer" containerID="c399003af904226f6c8e09f611ab672c768c8658da09c3f999f6da3561a1295c"
Dec 01 09:19:31 crc kubenswrapper[4763]: I1201 09:19:31.004151 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" path="/var/lib/kubelet/pods/0ce2b6fa-b131-466e-9ee9-4c4672c9fa18/volumes"
Dec 01 09:19:31 crc kubenswrapper[4763]: I1201 09:19:31.004803 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440365f2-877d-49bd-89c3-0dc4ad54efaa" path="/var/lib/kubelet/pods/440365f2-877d-49bd-89c3-0dc4ad54efaa/volumes"
Dec 01 09:19:31 crc kubenswrapper[4763]: I1201 09:19:31.005312 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" path="/var/lib/kubelet/pods/580e94d9-c525-4a0a-b965-6aefa59b2b64/volumes"
Dec 01 09:19:31 crc kubenswrapper[4763]: I1201 09:19:31.006860 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" path="/var/lib/kubelet/pods/91386cf5-c3df-4e87-be1a-14989dee67f9/volumes"
Dec 01 09:19:31 crc kubenswrapper[4763]: I1201 09:19:31.008217 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" path="/var/lib/kubelet/pods/caa7c3a1-91cb-460a-a74e-3027d72cdfcb/volumes"
Dec 01 09:19:34 crc kubenswrapper[4763]: I1201 09:19:34.771600 4763 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 01 09:19:34 crc kubenswrapper[4763]: I1201 09:19:34.772406 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654" gracePeriod=5
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.355517 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.356000 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.460755 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.460882 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.460907 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.460914 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.460944 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.460958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.460986 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.460993 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.461103 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.461216 4763 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.461230 4763 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.461240 4763 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.461249 4763 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.471986 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.484853 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.484914 4763 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654" exitCode=137
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.484965 4763 scope.go:117] "RemoveContainer" containerID="b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654"
Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.485015 4763 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.514864 4763 scope.go:117] "RemoveContainer" containerID="b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654" Dec 01 09:19:40 crc kubenswrapper[4763]: E1201 09:19:40.515281 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654\": container with ID starting with b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654 not found: ID does not exist" containerID="b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654" Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.515309 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654"} err="failed to get container status \"b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654\": rpc error: code = NotFound desc = could not find container \"b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654\": container with ID starting with b5dc1c10e874c32d797d063d170777393c29d12d9f7e90a0fa6756b62f03c654 not found: ID does not exist" Dec 01 09:19:40 crc kubenswrapper[4763]: I1201 09:19:40.562193 4763 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:41 crc kubenswrapper[4763]: I1201 09:19:41.003478 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.148220 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qdpq8"] Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.148936 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" podUID="940c2b58-e113-4dc3-8717-6d6be27a033d" containerName="controller-manager" containerID="cri-o://7ac6af7c882815e3b8611f5d2ef5e310e87561295f3e3f82722aacb37b0f2513" gracePeriod=30 Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.249432 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7"] Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.249663 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" podUID="5e605f4b-743f-42b2-a437-64983f66992b" containerName="route-controller-manager" containerID="cri-o://0151da93d83d86563ff3b50513aed9e54ee8664c22c7c910a3e79035e35a5f5e" gracePeriod=30 Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.526661 4763 generic.go:334] "Generic (PLEG): container finished" podID="5e605f4b-743f-42b2-a437-64983f66992b" containerID="0151da93d83d86563ff3b50513aed9e54ee8664c22c7c910a3e79035e35a5f5e" exitCode=0 Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.527049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" 
event={"ID":"5e605f4b-743f-42b2-a437-64983f66992b","Type":"ContainerDied","Data":"0151da93d83d86563ff3b50513aed9e54ee8664c22c7c910a3e79035e35a5f5e"} Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.532161 4763 generic.go:334] "Generic (PLEG): container finished" podID="940c2b58-e113-4dc3-8717-6d6be27a033d" containerID="7ac6af7c882815e3b8611f5d2ef5e310e87561295f3e3f82722aacb37b0f2513" exitCode=0 Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.532200 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" event={"ID":"940c2b58-e113-4dc3-8717-6d6be27a033d","Type":"ContainerDied","Data":"7ac6af7c882815e3b8611f5d2ef5e310e87561295f3e3f82722aacb37b0f2513"} Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.532227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" event={"ID":"940c2b58-e113-4dc3-8717-6d6be27a033d","Type":"ContainerDied","Data":"2f175d6b4d31143bd1e8dc72aa37087ea7e1e2ac372c9c505a5e50f7544e8635"} Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.532237 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f175d6b4d31143bd1e8dc72aa37087ea7e1e2ac372c9c505a5e50f7544e8635" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.559957 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.650952 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/940c2b58-e113-4dc3-8717-6d6be27a033d-serving-cert\") pod \"940c2b58-e113-4dc3-8717-6d6be27a033d\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.651016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-config\") pod \"940c2b58-e113-4dc3-8717-6d6be27a033d\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.651077 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-client-ca\") pod \"940c2b58-e113-4dc3-8717-6d6be27a033d\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.651109 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbq9k\" (UniqueName: \"kubernetes.io/projected/940c2b58-e113-4dc3-8717-6d6be27a033d-kube-api-access-zbq9k\") pod \"940c2b58-e113-4dc3-8717-6d6be27a033d\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.651180 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-proxy-ca-bundles\") pod \"940c2b58-e113-4dc3-8717-6d6be27a033d\" (UID: \"940c2b58-e113-4dc3-8717-6d6be27a033d\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.652209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"940c2b58-e113-4dc3-8717-6d6be27a033d" (UID: "940c2b58-e113-4dc3-8717-6d6be27a033d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.652345 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-config" (OuterVolumeSpecName: "config") pod "940c2b58-e113-4dc3-8717-6d6be27a033d" (UID: "940c2b58-e113-4dc3-8717-6d6be27a033d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.655835 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "940c2b58-e113-4dc3-8717-6d6be27a033d" (UID: "940c2b58-e113-4dc3-8717-6d6be27a033d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.661821 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940c2b58-e113-4dc3-8717-6d6be27a033d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "940c2b58-e113-4dc3-8717-6d6be27a033d" (UID: "940c2b58-e113-4dc3-8717-6d6be27a033d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.661961 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940c2b58-e113-4dc3-8717-6d6be27a033d-kube-api-access-zbq9k" (OuterVolumeSpecName: "kube-api-access-zbq9k") pod "940c2b58-e113-4dc3-8717-6d6be27a033d" (UID: "940c2b58-e113-4dc3-8717-6d6be27a033d"). InnerVolumeSpecName "kube-api-access-zbq9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.682691 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.751787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-config\") pod \"5e605f4b-743f-42b2-a437-64983f66992b\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.751852 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffgx2\" (UniqueName: \"kubernetes.io/projected/5e605f4b-743f-42b2-a437-64983f66992b-kube-api-access-ffgx2\") pod \"5e605f4b-743f-42b2-a437-64983f66992b\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.751887 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-client-ca\") pod \"5e605f4b-743f-42b2-a437-64983f66992b\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.752010 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e605f4b-743f-42b2-a437-64983f66992b-serving-cert\") pod \"5e605f4b-743f-42b2-a437-64983f66992b\" (UID: \"5e605f4b-743f-42b2-a437-64983f66992b\") " Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.752200 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/940c2b58-e113-4dc3-8717-6d6be27a033d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.752213 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.752221 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.752230 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbq9k\" (UniqueName: \"kubernetes.io/projected/940c2b58-e113-4dc3-8717-6d6be27a033d-kube-api-access-zbq9k\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.752270 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/940c2b58-e113-4dc3-8717-6d6be27a033d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.753042 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-client-ca" (OuterVolumeSpecName: "client-ca") pod "5e605f4b-743f-42b2-a437-64983f66992b" (UID: "5e605f4b-743f-42b2-a437-64983f66992b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.753080 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-config" (OuterVolumeSpecName: "config") pod "5e605f4b-743f-42b2-a437-64983f66992b" (UID: "5e605f4b-743f-42b2-a437-64983f66992b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.756180 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e605f4b-743f-42b2-a437-64983f66992b-kube-api-access-ffgx2" (OuterVolumeSpecName: "kube-api-access-ffgx2") pod "5e605f4b-743f-42b2-a437-64983f66992b" (UID: "5e605f4b-743f-42b2-a437-64983f66992b"). InnerVolumeSpecName "kube-api-access-ffgx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.756206 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e605f4b-743f-42b2-a437-64983f66992b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5e605f4b-743f-42b2-a437-64983f66992b" (UID: "5e605f4b-743f-42b2-a437-64983f66992b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.853674 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e605f4b-743f-42b2-a437-64983f66992b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.853719 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.853731 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffgx2\" (UniqueName: \"kubernetes.io/projected/5e605f4b-743f-42b2-a437-64983f66992b-kube-api-access-ffgx2\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:47 crc kubenswrapper[4763]: I1201 09:19:47.853744 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5e605f4b-743f-42b2-a437-64983f66992b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.537693 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-qdpq8" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.538380 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.542566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7" event={"ID":"5e605f4b-743f-42b2-a437-64983f66992b","Type":"ContainerDied","Data":"2dfb3f6e37e8453957e6d73555c517d1e897cb0c30b203a463ad0286a43eb86e"} Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.542617 4763 scope.go:117] "RemoveContainer" containerID="0151da93d83d86563ff3b50513aed9e54ee8664c22c7c910a3e79035e35a5f5e" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.596320 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7"] Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.612613 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s7nl7"] Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.623526 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qdpq8"] Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.631449 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-qdpq8"] Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717526 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-rvqhp"] Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717725 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717736 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717745 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerName="extract-utilities" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717753 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerName="extract-utilities" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717761 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717768 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717776 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerName="extract-content" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717781 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerName="extract-content" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717793 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940c2b58-e113-4dc3-8717-6d6be27a033d" containerName="controller-manager" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717799 4763 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="940c2b58-e113-4dc3-8717-6d6be27a033d" containerName="controller-manager" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717808 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="extract-utilities" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717814 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="extract-utilities" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717823 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717828 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717837 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717843 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717850 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="extract-utilities" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717857 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="extract-utilities" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717864 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717869 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717877 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440365f2-877d-49bd-89c3-0dc4ad54efaa" containerName="marketplace-operator" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717884 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="440365f2-877d-49bd-89c3-0dc4ad54efaa" containerName="marketplace-operator" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717890 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerName="extract-utilities" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717896 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerName="extract-utilities" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717905 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerName="extract-content" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717911 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerName="extract-content" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717919 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="extract-content" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717925 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="extract-content" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717931 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e605f4b-743f-42b2-a437-64983f66992b" containerName="route-controller-manager" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717936 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e605f4b-743f-42b2-a437-64983f66992b" containerName="route-controller-manager" Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.717946 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="extract-content" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.717951 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="extract-content" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718037 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="580e94d9-c525-4a0a-b965-6aefa59b2b64" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718048 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce2b6fa-b131-466e-9ee9-4c4672c9fa18" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718056 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e605f4b-743f-42b2-a437-64983f66992b" containerName="route-controller-manager" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718067 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="91386cf5-c3df-4e87-be1a-14989dee67f9" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718074 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa7c3a1-91cb-460a-a74e-3027d72cdfcb" containerName="registry-server" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718083 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718090 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="940c2b58-e113-4dc3-8717-6d6be27a033d" containerName="controller-manager" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718099 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="440365f2-877d-49bd-89c3-0dc4ad54efaa" containerName="marketplace-operator" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.718533 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.721500 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.721984 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.722562 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.724775 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.724953 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.725532 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.734128 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.743402 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4"] Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.743982 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: W1201 09:19:48.745584 4763 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.745629 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 09:19:48 crc kubenswrapper[4763]: W1201 09:19:48.746863 4763 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.746888 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the 
namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 09:19:48 crc kubenswrapper[4763]: W1201 09:19:48.747423 4763 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Dec 01 09:19:48 crc kubenswrapper[4763]: E1201 09:19:48.747469 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.747625 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.747869 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.748240 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.805096 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-rvqhp"] Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.866840 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4"] Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.891048 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-proxy-ca-bundles\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.891273 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgq4\" (UniqueName: \"kubernetes.io/projected/09a34e1b-4271-46b4-bff6-6a7426b16555-kube-api-access-6cgq4\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.891381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-config\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.891539 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-client-ca\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.891678 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-client-ca\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.891789 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09a34e1b-4271-46b4-bff6-6a7426b16555-serving-cert\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.891880 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.892004 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qttrh\" (UniqueName: \"kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.892119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-config\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993516 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09a34e1b-4271-46b4-bff6-6a7426b16555-serving-cert\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993815 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993849 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qttrh\" (UniqueName: 
\"kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993879 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-config\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cgq4\" (UniqueName: \"kubernetes.io/projected/09a34e1b-4271-46b4-bff6-6a7426b16555-kube-api-access-6cgq4\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993917 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-proxy-ca-bundles\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993935 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-config\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993962 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-client-ca\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.993983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-client-ca\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.994829 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-client-ca\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.995561 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-client-ca\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " 
pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.995647 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-config\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.996016 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-config\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:48 crc kubenswrapper[4763]: I1201 09:19:48.996550 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-proxy-ca-bundles\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.002438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09a34e1b-4271-46b4-bff6-6a7426b16555-serving-cert\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.004167 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e605f4b-743f-42b2-a437-64983f66992b" path="/var/lib/kubelet/pods/5e605f4b-743f-42b2-a437-64983f66992b/volumes" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.004932 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940c2b58-e113-4dc3-8717-6d6be27a033d" path="/var/lib/kubelet/pods/940c2b58-e113-4dc3-8717-6d6be27a033d/volumes" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.016828 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgq4\" (UniqueName: \"kubernetes.io/projected/09a34e1b-4271-46b4-bff6-6a7426b16555-kube-api-access-6cgq4\") pod \"controller-manager-884c54fcb-rvqhp\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.030712 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.237386 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-rvqhp"] Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.250209 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4"] Dec 01 09:19:49 crc kubenswrapper[4763]: E1201 09:19:49.257874 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qttrh serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" podUID="e848875d-5c33-4ca4-b9c6-c9d2111fdb17" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.445519 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-rvqhp"] Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.543500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" event={"ID":"09a34e1b-4271-46b4-bff6-6a7426b16555","Type":"ContainerStarted","Data":"8a43cff8156234a4e024dc30b331c739dbb5048a29706a993ef02f7f8df8a8d5"} Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.544639 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.563347 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.701747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-client-ca\") pod \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.701824 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-config\") pod \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.702604 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-config" (OuterVolumeSpecName: "config") pod "e848875d-5c33-4ca4-b9c6-c9d2111fdb17" (UID: "e848875d-5c33-4ca4-b9c6-c9d2111fdb17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.702804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-client-ca" (OuterVolumeSpecName: "client-ca") pod "e848875d-5c33-4ca4-b9c6-c9d2111fdb17" (UID: "e848875d-5c33-4ca4-b9c6-c9d2111fdb17"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.802779 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:49 crc kubenswrapper[4763]: I1201 09:19:49.802822 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:49 crc kubenswrapper[4763]: E1201 09:19:49.994052 4763 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 01 09:19:49 crc kubenswrapper[4763]: E1201 09:19:49.994149 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert podName:e848875d-5c33-4ca4-b9c6-c9d2111fdb17 nodeName:}" failed. No retries permitted until 2025-12-01 09:19:50.494128378 +0000 UTC m=+307.762777136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert") pod "route-controller-manager-7656c99f76-vkdg4" (UID: "e848875d-5c33-4ca4-b9c6-c9d2111fdb17") : failed to sync secret cache: timed out waiting for the condition Dec 01 09:19:50 crc kubenswrapper[4763]: E1201 09:19:50.020967 4763 projected.go:288] Couldn't get configMap openshift-route-controller-manager/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 01 09:19:50 crc kubenswrapper[4763]: E1201 09:19:50.020997 4763 projected.go:194] Error preparing data for projected volume kube-api-access-qttrh for pod openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4: failed to sync configmap cache: timed out waiting for the condition Dec 01 09:19:50 crc kubenswrapper[4763]: E1201 09:19:50.021059 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh podName:e848875d-5c33-4ca4-b9c6-c9d2111fdb17 nodeName:}" failed. No retries permitted until 2025-12-01 09:19:50.521041256 +0000 UTC m=+307.789690014 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qttrh" (UniqueName: "kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh") pod "route-controller-manager-7656c99f76-vkdg4" (UID: "e848875d-5c33-4ca4-b9c6-c9d2111fdb17") : failed to sync configmap cache: timed out waiting for the condition Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.102564 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.118611 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.322685 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.516117 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.520681 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4"
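The backoff above comes from the volume manager's pending-operations table: MountVolume.SetUp for serving-cert and kube-api-access-qttrh failed because the just-created pod's secret and configmap caches had not synced yet, so no retry was permitted for 500ms (durationBeforeRetry). Once the reflector caches populated at 09:19:50, the retried serving-cert mount succeeded above, and the kube-api-access-qttrh mount follows below. A sketch of the retry gate (illustrative only, not kubelet's nestedpendingoperations source, which also grows and caps the delay):

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// Per-volume retry gate behind "No retries permitted until ...
// (durationBeforeRetry 500ms)".
type opState struct {
	retryAfter time.Time
	delay      time.Duration
}

type pendingOps struct{ ops map[string]*opState }

func (p *pendingOps) run(volume string, op func() error) error {
	st, ok := p.ops[volume]
	if !ok {
		st = &opState{delay: 500 * time.Millisecond}
		p.ops[volume] = st
	}
	if time.Now().Before(st.retryAfter) {
		return errors.New("no retries permitted yet") // still backing off
	}
	if err := op(); err != nil {
		st.retryAfter = time.Now().Add(st.delay)
		st.delay *= 2 // the real implementation grows and caps this
		return err
	}
	st.delay = 500 * time.Millisecond // success resets the backoff
	return nil
}

func main() {
	p := &pendingOps{ops: map[string]*opState{}}
	mount := func() error { return errors.New("failed to sync secret cache") }
	fmt.Println(p.run("serving-cert", mount)) // first attempt fails
	fmt.Println(p.run("serving-cert", mount)) // immediately rejected: in backoff
	time.Sleep(600 * time.Millisecond)
	fmt.Println(p.run("serving-cert", func() error { return nil })) // cache synced: <nil>
}
```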
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.552277 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" event={"ID":"09a34e1b-4271-46b4-bff6-6a7426b16555","Type":"ContainerStarted","Data":"91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185"} Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.552655 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" podUID="09a34e1b-4271-46b4-bff6-6a7426b16555" containerName="controller-manager" containerID="cri-o://91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185" gracePeriod=30 Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.577489 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" podStartSLOduration=2.577473671 podStartE2EDuration="2.577473671s" podCreationTimestamp="2025-12-01 09:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:19:50.572088011 +0000 UTC m=+307.840736779" watchObservedRunningTime="2025-12-01 09:19:50.577473671 +0000 UTC m=+307.846122439" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.617095 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert\") pod \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.617344 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qttrh\" (UniqueName: \"kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.620999 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e848875d-5c33-4ca4-b9c6-c9d2111fdb17" (UID: "e848875d-5c33-4ca4-b9c6-c9d2111fdb17"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.621039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qttrh\" (UniqueName: \"kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh\") pod \"route-controller-manager-7656c99f76-vkdg4\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.717783 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qttrh\" (UniqueName: \"kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh\") pod \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\" (UID: \"e848875d-5c33-4ca4-b9c6-c9d2111fdb17\") " Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.718109 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.720767 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh" (OuterVolumeSpecName: "kube-api-access-qttrh") pod "e848875d-5c33-4ca4-b9c6-c9d2111fdb17" (UID: "e848875d-5c33-4ca4-b9c6-c9d2111fdb17"). InnerVolumeSpecName "kube-api-access-qttrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.819648 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qttrh\" (UniqueName: \"kubernetes.io/projected/e848875d-5c33-4ca4-b9c6-c9d2111fdb17-kube-api-access-qttrh\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.872436 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.898436 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"] Dec 01 09:19:50 crc kubenswrapper[4763]: E1201 09:19:50.898697 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a34e1b-4271-46b4-bff6-6a7426b16555" containerName="controller-manager" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.898715 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a34e1b-4271-46b4-bff6-6a7426b16555" containerName="controller-manager" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.898807 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a34e1b-4271-46b4-bff6-6a7426b16555" containerName="controller-manager" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.899187 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.901631 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.905534 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.905702 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.905808 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.905902 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.906112 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.910012 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4"] Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.913055 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7656c99f76-vkdg4"] Dec 01 09:19:50 crc kubenswrapper[4763]: I1201 09:19:50.917810 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"] Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.001032 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e848875d-5c33-4ca4-b9c6-c9d2111fdb17" path="/var/lib/kubelet/pods/e848875d-5c33-4ca4-b9c6-c9d2111fdb17/volumes" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.021980 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cgq4\" (UniqueName: \"kubernetes.io/projected/09a34e1b-4271-46b4-bff6-6a7426b16555-kube-api-access-6cgq4\") pod \"09a34e1b-4271-46b4-bff6-6a7426b16555\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.022080 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-config\") pod \"09a34e1b-4271-46b4-bff6-6a7426b16555\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.022134 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09a34e1b-4271-46b4-bff6-6a7426b16555-serving-cert\") pod \"09a34e1b-4271-46b4-bff6-6a7426b16555\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.022153 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-client-ca\") pod \"09a34e1b-4271-46b4-bff6-6a7426b16555\" (UID: \"09a34e1b-4271-46b4-bff6-6a7426b16555\") " Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.022287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-config\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.022322 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5008358d-1fd7-4aa1-ab44-3aa248f575a5-serving-cert\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.022364 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-client-ca\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.022396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4bdx\" (UniqueName: \"kubernetes.io/projected/5008358d-1fd7-4aa1-ab44-3aa248f575a5-kube-api-access-w4bdx\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.023209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-client-ca" (OuterVolumeSpecName: "client-ca") pod "09a34e1b-4271-46b4-bff6-6a7426b16555" (UID: "09a34e1b-4271-46b4-bff6-6a7426b16555"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.023555 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-config" (OuterVolumeSpecName: "config") pod "09a34e1b-4271-46b4-bff6-6a7426b16555" (UID: "09a34e1b-4271-46b4-bff6-6a7426b16555"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.025391 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a34e1b-4271-46b4-bff6-6a7426b16555-kube-api-access-6cgq4" (OuterVolumeSpecName: "kube-api-access-6cgq4") pod "09a34e1b-4271-46b4-bff6-6a7426b16555" (UID: "09a34e1b-4271-46b4-bff6-6a7426b16555"). InnerVolumeSpecName "kube-api-access-6cgq4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.023201 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "09a34e1b-4271-46b4-bff6-6a7426b16555" (UID: "09a34e1b-4271-46b4-bff6-6a7426b16555"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.025665 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a34e1b-4271-46b4-bff6-6a7426b16555-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09a34e1b-4271-46b4-bff6-6a7426b16555" (UID: "09a34e1b-4271-46b4-bff6-6a7426b16555"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123123 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-client-ca\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123184 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4bdx\" (UniqueName: \"kubernetes.io/projected/5008358d-1fd7-4aa1-ab44-3aa248f575a5-kube-api-access-w4bdx\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123208 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-config\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5008358d-1fd7-4aa1-ab44-3aa248f575a5-serving-cert\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123274 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cgq4\" (UniqueName: \"kubernetes.io/projected/09a34e1b-4271-46b4-bff6-6a7426b16555-kube-api-access-6cgq4\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123287 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-config\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123296 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09a34e1b-4271-46b4-bff6-6a7426b16555-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123304 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-client-ca\") on node \"crc\" DevicePath \"\""
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.123312 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/09a34e1b-4271-46b4-bff6-6a7426b16555-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.124678 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-config\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.125079 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-client-ca\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.127506 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5008358d-1fd7-4aa1-ab44-3aa248f575a5-serving-cert\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.141097 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4bdx\" (UniqueName: \"kubernetes.io/projected/5008358d-1fd7-4aa1-ab44-3aa248f575a5-kube-api-access-w4bdx\") pod \"route-controller-manager-6ffd8f858-6jdhv\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.216530 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.473110 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"] Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.559805 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" event={"ID":"5008358d-1fd7-4aa1-ab44-3aa248f575a5","Type":"ContainerStarted","Data":"636e4c1f2b648aab204b3757a8731b0f8ddadba3ffa931e33ff5df26f784b8a2"} Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.561508 4763 generic.go:334] "Generic (PLEG): container finished" podID="09a34e1b-4271-46b4-bff6-6a7426b16555" containerID="91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185" exitCode=0 Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.561534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" event={"ID":"09a34e1b-4271-46b4-bff6-6a7426b16555","Type":"ContainerDied","Data":"91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185"} Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.561550 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" event={"ID":"09a34e1b-4271-46b4-bff6-6a7426b16555","Type":"ContainerDied","Data":"8a43cff8156234a4e024dc30b331c739dbb5048a29706a993ef02f7f8df8a8d5"} Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.561565 4763 scope.go:117] "RemoveContainer" containerID="91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.561596 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-884c54fcb-rvqhp" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.587349 4763 scope.go:117] "RemoveContainer" containerID="91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185" Dec 01 09:19:51 crc kubenswrapper[4763]: E1201 09:19:51.591598 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185\": container with ID starting with 91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185 not found: ID does not exist" containerID="91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.591643 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185"} err="failed to get container status \"91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185\": rpc error: code = NotFound desc = could not find container \"91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185\": container with ID starting with 91aaf3384ebf449db4f55fba26037d0370c04584119c2b24ffe2b89fdec9b185 not found: ID does not exist" Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.595189 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-rvqhp"] Dec 01 09:19:51 crc kubenswrapper[4763]: I1201 09:19:51.598301 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-rvqhp"] Dec 01 09:19:52 crc kubenswrapper[4763]: I1201 09:19:52.568812 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" event={"ID":"5008358d-1fd7-4aa1-ab44-3aa248f575a5","Type":"ContainerStarted","Data":"cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de"} Dec 01 09:19:52 crc kubenswrapper[4763]: I1201 09:19:52.570275 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:19:52 crc kubenswrapper[4763]: I1201 09:19:52.573910 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:19:52 crc kubenswrapper[4763]: I1201 09:19:52.616902 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" podStartSLOduration=3.6168850299999997 podStartE2EDuration="3.61688503s" podCreationTimestamp="2025-12-01 09:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:19:52.599777815 +0000 UTC m=+309.868426603" watchObservedRunningTime="2025-12-01 09:19:52.61688503 +0000 UTC m=+309.885533818" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.000636 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a34e1b-4271-46b4-bff6-6a7426b16555" path="/var/lib/kubelet/pods/09a34e1b-4271-46b4-bff6-6a7426b16555/volumes" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.063037 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-699fc888d4-x74wv"] Dec 01 
09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.063779 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.066521 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.066889 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.067095 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.067384 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.067645 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.067811 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.074687 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.086100 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699fc888d4-x74wv"] Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.189407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-proxy-ca-bundles\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.189503 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp9c4\" (UniqueName: \"kubernetes.io/projected/2c059dd2-cdca-4f0f-9978-e4838fa5c325-kube-api-access-rp9c4\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.189552 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c059dd2-cdca-4f0f-9978-e4838fa5c325-serving-cert\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.189588 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-client-ca\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.189625 4763 reconciler_common.go:245] 
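[Editor's note] The E-severity "ContainerStatus from runtime service failed ... NotFound" above is a harmless race: the kubelet asked CRI-O for the status of container 91aaf338... while deleting it, and the runtime had already removed it, as the "DeleteContainer returned error" line confirms. The PLEG events are the authoritative lifecycle record, and a per-pod timeline can be rebuilt from them. A sketch (kubelet.log is a placeholder path):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Rebuilds a per-pod container event timeline from the kubelet's
// "SyncLoop (PLEG): event for pod" entries.
var pleg = regexp.MustCompile(
	`"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=\{"ID":"[^"]+","Type":"(\w+)","Data":"(\w+)"\}`)

func main() {
	f, err := os.Open("kubelet.log") // placeholder path
	if err != nil {
		panic(err)
	}
	defer f.Close()

	timeline := map[string][]string{} // pod -> ordered events
	var order []string                // pods in first-seen order
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := pleg.FindStringSubmatch(sc.Text()); m != nil {
			pod, typ, id := m[1], m[2], m[3]
			if _, seen := timeline[pod]; !seen {
				order = append(order, pod)
			}
			timeline[pod] = append(timeline[pod], typ+" "+id[:12])
		}
	}
	for _, pod := range order {
		fmt.Println(pod)
		for _, e := range timeline[pod] {
			fmt.Println("  ", e)
		}
	}
}
```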
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-config\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.291091 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c059dd2-cdca-4f0f-9978-e4838fa5c325-serving-cert\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.291159 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-client-ca\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.291200 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-config\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.291219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-proxy-ca-bundles\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.291270 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp9c4\" (UniqueName: \"kubernetes.io/projected/2c059dd2-cdca-4f0f-9978-e4838fa5c325-kube-api-access-rp9c4\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.293216 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-client-ca\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.293951 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-config\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.294869 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-proxy-ca-bundles\") pod \"controller-manager-699fc888d4-x74wv\" (UID: 
\"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.298014 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c059dd2-cdca-4f0f-9978-e4838fa5c325-serving-cert\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.307896 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp9c4\" (UniqueName: \"kubernetes.io/projected/2c059dd2-cdca-4f0f-9978-e4838fa5c325-kube-api-access-rp9c4\") pod \"controller-manager-699fc888d4-x74wv\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") " pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.384691 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:53 crc kubenswrapper[4763]: I1201 09:19:53.576155 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699fc888d4-x74wv"] Dec 01 09:19:53 crc kubenswrapper[4763]: W1201 09:19:53.592052 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c059dd2_cdca_4f0f_9978_e4838fa5c325.slice/crio-8574e55aa7ad7f6b32766021fc71d447a7ad28a1f370fc4f8197376562a383ff WatchSource:0}: Error finding container 8574e55aa7ad7f6b32766021fc71d447a7ad28a1f370fc4f8197376562a383ff: Status 404 returned error can't find the container with id 8574e55aa7ad7f6b32766021fc71d447a7ad28a1f370fc4f8197376562a383ff Dec 01 09:19:54 crc kubenswrapper[4763]: I1201 09:19:54.590175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" event={"ID":"2c059dd2-cdca-4f0f-9978-e4838fa5c325","Type":"ContainerStarted","Data":"ac3d1dc31e5df3ea5cda89e1bb752e2721e963381bfa03ae1699d14e8a0dffc2"} Dec 01 09:19:54 crc kubenswrapper[4763]: I1201 09:19:54.590544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" event={"ID":"2c059dd2-cdca-4f0f-9978-e4838fa5c325","Type":"ContainerStarted","Data":"8574e55aa7ad7f6b32766021fc71d447a7ad28a1f370fc4f8197376562a383ff"} Dec 01 09:19:54 crc kubenswrapper[4763]: I1201 09:19:54.590566 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:54 crc kubenswrapper[4763]: I1201 09:19:54.597099 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:19:54 crc kubenswrapper[4763]: I1201 09:19:54.614560 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" podStartSLOduration=5.614543969 podStartE2EDuration="5.614543969s" podCreationTimestamp="2025-12-01 09:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:19:54.610105916 +0000 UTC m=+311.878754694" watchObservedRunningTime="2025-12-01 09:19:54.614543969 +0000 
UTC m=+311.883192737" Dec 01 09:20:01 crc kubenswrapper[4763]: I1201 09:20:01.938633 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkpnv"] Dec 01 09:20:01 crc kubenswrapper[4763]: I1201 09:20:01.940405 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:01 crc kubenswrapper[4763]: I1201 09:20:01.942425 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 01 09:20:01 crc kubenswrapper[4763]: I1201 09:20:01.994421 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkpnv"] Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.046160 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pnkw4"] Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.046970 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.069590 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pnkw4"] Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.107232 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-utilities\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.107278 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-catalog-content\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.107315 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8q7q\" (UniqueName: \"kubernetes.io/projected/ff859352-a99d-4a67-9126-ec6a056b3236-kube-api-access-h8q7q\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.132093 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hc4lf"] Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.133713 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.138433 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.149717 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hc4lf"] Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.208683 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8q7q\" (UniqueName: \"kubernetes.io/projected/ff859352-a99d-4a67-9126-ec6a056b3236-kube-api-access-h8q7q\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.208761 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-registry-tls\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.208797 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fee01654-c09e-4235-aefe-dfdf38ebc98c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.208852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gntq\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-kube-api-access-7gntq\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.208879 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fee01654-c09e-4235-aefe-dfdf38ebc98c-registry-certificates\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.208909 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-bound-sa-token\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.208940 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.208984 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fee01654-c09e-4235-aefe-dfdf38ebc98c-trusted-ca\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.209006 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-utilities\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.209026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-catalog-content\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.209061 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fee01654-c09e-4235-aefe-dfdf38ebc98c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.210253 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-utilities\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.210565 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-catalog-content\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.237048 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.239098 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8q7q\" (UniqueName: \"kubernetes.io/projected/ff859352-a99d-4a67-9126-ec6a056b3236-kube-api-access-h8q7q\") pod \"redhat-operators-xkpnv\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.256346 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310260 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-registry-tls\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310311 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fee01654-c09e-4235-aefe-dfdf38ebc98c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310342 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gntq\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-kube-api-access-7gntq\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09b38c8-91a6-45ed-b97b-d0370e99ab11-utilities\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310385 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fee01654-c09e-4235-aefe-dfdf38ebc98c-registry-certificates\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310405 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-bound-sa-token\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310430 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09b38c8-91a6-45ed-b97b-d0370e99ab11-catalog-content\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310519 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fee01654-c09e-4235-aefe-dfdf38ebc98c-trusted-ca\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fv98b\" (UniqueName: \"kubernetes.io/projected/a09b38c8-91a6-45ed-b97b-d0370e99ab11-kube-api-access-fv98b\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.310578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fee01654-c09e-4235-aefe-dfdf38ebc98c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.311713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fee01654-c09e-4235-aefe-dfdf38ebc98c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.312908 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fee01654-c09e-4235-aefe-dfdf38ebc98c-registry-certificates\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.315710 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-registry-tls\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.316893 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fee01654-c09e-4235-aefe-dfdf38ebc98c-trusted-ca\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.318949 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fee01654-c09e-4235-aefe-dfdf38ebc98c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.327613 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gntq\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-kube-api-access-7gntq\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.332271 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fee01654-c09e-4235-aefe-dfdf38ebc98c-bound-sa-token\") pod \"image-registry-66df7c8f76-pnkw4\" (UID: \"fee01654-c09e-4235-aefe-dfdf38ebc98c\") " pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 
09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.370986 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.412006 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09b38c8-91a6-45ed-b97b-d0370e99ab11-utilities\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.412072 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09b38c8-91a6-45ed-b97b-d0370e99ab11-catalog-content\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.412107 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv98b\" (UniqueName: \"kubernetes.io/projected/a09b38c8-91a6-45ed-b97b-d0370e99ab11-kube-api-access-fv98b\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.412448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09b38c8-91a6-45ed-b97b-d0370e99ab11-utilities\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.412586 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09b38c8-91a6-45ed-b97b-d0370e99ab11-catalog-content\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.431170 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv98b\" (UniqueName: \"kubernetes.io/projected/a09b38c8-91a6-45ed-b97b-d0370e99ab11-kube-api-access-fv98b\") pod \"certified-operators-hc4lf\" (UID: \"a09b38c8-91a6-45ed-b97b-d0370e99ab11\") " pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.447015 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.686912 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkpnv"] Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.803975 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-pnkw4"] Dec 01 09:20:02 crc kubenswrapper[4763]: W1201 09:20:02.811328 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee01654_c09e_4235_aefe_dfdf38ebc98c.slice/crio-6e4be77ddd9ccaeed438ed265ef9a7fd81b3a5e38d50ee19446c2d0d03cedd0d WatchSource:0}: Error finding container 6e4be77ddd9ccaeed438ed265ef9a7fd81b3a5e38d50ee19446c2d0d03cedd0d: Status 404 returned error can't find the container with id 6e4be77ddd9ccaeed438ed265ef9a7fd81b3a5e38d50ee19446c2d0d03cedd0d Dec 01 09:20:02 crc kubenswrapper[4763]: I1201 09:20:02.878490 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hc4lf"] Dec 01 09:20:02 crc kubenswrapper[4763]: W1201 09:20:02.885673 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda09b38c8_91a6_45ed_b97b_d0370e99ab11.slice/crio-3ccf35ecefaaf12059b18238bf4469d2410b190cfdaba6dec9c65a1f246c0182 WatchSource:0}: Error finding container 3ccf35ecefaaf12059b18238bf4469d2410b190cfdaba6dec9c65a1f246c0182: Status 404 returned error can't find the container with id 3ccf35ecefaaf12059b18238bf4469d2410b190cfdaba6dec9c65a1f246c0182 Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.633653 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" event={"ID":"fee01654-c09e-4235-aefe-dfdf38ebc98c","Type":"ContainerStarted","Data":"bc89292702468c7f0e2ec3b4b7119d071a2461e438254c5f5cfa654bc1cf752c"} Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.633987 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.634001 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" event={"ID":"fee01654-c09e-4235-aefe-dfdf38ebc98c","Type":"ContainerStarted","Data":"6e4be77ddd9ccaeed438ed265ef9a7fd81b3a5e38d50ee19446c2d0d03cedd0d"} Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.636233 4763 generic.go:334] "Generic (PLEG): container finished" podID="a09b38c8-91a6-45ed-b97b-d0370e99ab11" containerID="201d011f045db82b178cdde8fd050d47462097c28168ea87b399c674decaf55a" exitCode=0 Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.636286 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hc4lf" event={"ID":"a09b38c8-91a6-45ed-b97b-d0370e99ab11","Type":"ContainerDied","Data":"201d011f045db82b178cdde8fd050d47462097c28168ea87b399c674decaf55a"} Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.636357 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hc4lf" event={"ID":"a09b38c8-91a6-45ed-b97b-d0370e99ab11","Type":"ContainerStarted","Data":"3ccf35ecefaaf12059b18238bf4469d2410b190cfdaba6dec9c65a1f246c0182"} Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.639797 4763 generic.go:334] "Generic (PLEG): container 
finished" podID="ff859352-a99d-4a67-9126-ec6a056b3236" containerID="2175e774b58a169cb873945fb1b793f2cdbf5d40c186c621572258c9b2c969a0" exitCode=0 Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.639846 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkpnv" event={"ID":"ff859352-a99d-4a67-9126-ec6a056b3236","Type":"ContainerDied","Data":"2175e774b58a169cb873945fb1b793f2cdbf5d40c186c621572258c9b2c969a0"} Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.639872 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkpnv" event={"ID":"ff859352-a99d-4a67-9126-ec6a056b3236","Type":"ContainerStarted","Data":"2a58ab9533de7aea397bf5b3ce3f5cbb58896123a3a23980e0498490220bb553"} Dec 01 09:20:03 crc kubenswrapper[4763]: I1201 09:20:03.685309 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" podStartSLOduration=1.685289195 podStartE2EDuration="1.685289195s" podCreationTimestamp="2025-12-01 09:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:20:03.661273488 +0000 UTC m=+320.929922276" watchObservedRunningTime="2025-12-01 09:20:03.685289195 +0000 UTC m=+320.953937963" Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.336442 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f454l"] Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.337736 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f454l" Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.339833 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.358969 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f454l"] Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.438329 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-catalog-content\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l" Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.438418 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-utilities\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l" Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.438757 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxtld\" (UniqueName: \"kubernetes.io/projected/42e4fb80-00e7-430a-8154-e4e581437bfe-kube-api-access-sxtld\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l" Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.533258 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfqr"] Dec 01 09:20:04 crc 
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.534201 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.536597 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.539819 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxtld\" (UniqueName: \"kubernetes.io/projected/42e4fb80-00e7-430a-8154-e4e581437bfe-kube-api-access-sxtld\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.539870 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-catalog-content\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.539902 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-utilities\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.540777 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-utilities\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.541611 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-catalog-content\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.562849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxtld\" (UniqueName: \"kubernetes.io/projected/42e4fb80-00e7-430a-8154-e4e581437bfe-kube-api-access-sxtld\") pod \"community-operators-f454l\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " pod="openshift-marketplace/community-operators-f454l"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.576789 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfqr"]
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.642717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52470a23-807a-48bd-968d-eb43cb36b804-utilities\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.642777 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52470a23-807a-48bd-968d-eb43cb36b804-catalog-content\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.642828 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dl4\" (UniqueName: \"kubernetes.io/projected/52470a23-807a-48bd-968d-eb43cb36b804-kube-api-access-s7dl4\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.646382 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkpnv" event={"ID":"ff859352-a99d-4a67-9126-ec6a056b3236","Type":"ContainerStarted","Data":"c4e5c01e829785d3129ebb69ce64dba2c1b7e9f1450fa9bb895cffb5a0e5fcff"}
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.676203 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f454l"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.744423 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52470a23-807a-48bd-968d-eb43cb36b804-utilities\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.744490 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52470a23-807a-48bd-968d-eb43cb36b804-catalog-content\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.744547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dl4\" (UniqueName: \"kubernetes.io/projected/52470a23-807a-48bd-968d-eb43cb36b804-kube-api-access-s7dl4\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.745006 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52470a23-807a-48bd-968d-eb43cb36b804-utilities\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.745729 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52470a23-807a-48bd-968d-eb43cb36b804-catalog-content\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.792155 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dl4\" (UniqueName: \"kubernetes.io/projected/52470a23-807a-48bd-968d-eb43cb36b804-kube-api-access-s7dl4\") pod \"redhat-marketplace-xwfqr\" (UID: \"52470a23-807a-48bd-968d-eb43cb36b804\") " pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:04 crc kubenswrapper[4763]: I1201 09:20:04.848888 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwfqr"
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.107801 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f454l"]
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.252964 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfqr"]
Dec 01 09:20:05 crc kubenswrapper[4763]: W1201 09:20:05.305224 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52470a23_807a_48bd_968d_eb43cb36b804.slice/crio-2a7ccad6d99c62a22b07b029e0a31da9107391b864930d2a11d3706e8e858e14 WatchSource:0}: Error finding container 2a7ccad6d99c62a22b07b029e0a31da9107391b864930d2a11d3706e8e858e14: Status 404 returned error can't find the container with id 2a7ccad6d99c62a22b07b029e0a31da9107391b864930d2a11d3706e8e858e14
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.653785 4763 generic.go:334] "Generic (PLEG): container finished" podID="ff859352-a99d-4a67-9126-ec6a056b3236" containerID="c4e5c01e829785d3129ebb69ce64dba2c1b7e9f1450fa9bb895cffb5a0e5fcff" exitCode=0
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.653870 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkpnv" event={"ID":"ff859352-a99d-4a67-9126-ec6a056b3236","Type":"ContainerDied","Data":"c4e5c01e829785d3129ebb69ce64dba2c1b7e9f1450fa9bb895cffb5a0e5fcff"}
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.656307 4763 generic.go:334] "Generic (PLEG): container finished" podID="52470a23-807a-48bd-968d-eb43cb36b804" containerID="e7b295f70b9583a434a15ee21a10a68ee675c157d924fcfabeb951ad749ada5a" exitCode=0
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.656370 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfqr" event={"ID":"52470a23-807a-48bd-968d-eb43cb36b804","Type":"ContainerDied","Data":"e7b295f70b9583a434a15ee21a10a68ee675c157d924fcfabeb951ad749ada5a"}
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.656411 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfqr" event={"ID":"52470a23-807a-48bd-968d-eb43cb36b804","Type":"ContainerStarted","Data":"2a7ccad6d99c62a22b07b029e0a31da9107391b864930d2a11d3706e8e858e14"}
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.658367 4763 generic.go:334] "Generic (PLEG): container finished" podID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerID="51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3" exitCode=0
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.658447 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f454l" event={"ID":"42e4fb80-00e7-430a-8154-e4e581437bfe","Type":"ContainerDied","Data":"51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3"}
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.658508 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f454l" event={"ID":"42e4fb80-00e7-430a-8154-e4e581437bfe","Type":"ContainerStarted","Data":"d7a2b2ff421d8ab63b41595814424dc6bc3b41d1bf393659160ebeb72cad6008"}
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.663302 4763 generic.go:334] "Generic (PLEG): container finished" podID="a09b38c8-91a6-45ed-b97b-d0370e99ab11" containerID="bf030d997f144129418b42a808fd180762a4f3f5a04ef34ef56b0662d20ec314" exitCode=0
Dec 01 09:20:05 crc kubenswrapper[4763]: I1201 09:20:05.663341 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hc4lf" event={"ID":"a09b38c8-91a6-45ed-b97b-d0370e99ab11","Type":"ContainerDied","Data":"bf030d997f144129418b42a808fd180762a4f3f5a04ef34ef56b0662d20ec314"}
Dec 01 09:20:06 crc kubenswrapper[4763]: I1201 09:20:06.691188 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hc4lf" event={"ID":"a09b38c8-91a6-45ed-b97b-d0370e99ab11","Type":"ContainerStarted","Data":"3ff52f886b6d600ffcd828b23e6b34f927df77e5159b06a6650038b8186afc05"}
Dec 01 09:20:06 crc kubenswrapper[4763]: I1201 09:20:06.716592 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hc4lf" podStartSLOduration=2.301632576 podStartE2EDuration="4.71657729s" podCreationTimestamp="2025-12-01 09:20:02 +0000 UTC" firstStartedPulling="2025-12-01 09:20:03.638111533 +0000 UTC m=+320.906760301" lastFinishedPulling="2025-12-01 09:20:06.053056247 +0000 UTC m=+323.321705015" observedRunningTime="2025-12-01 09:20:06.710289926 +0000 UTC m=+323.978938694" watchObservedRunningTime="2025-12-01 09:20:06.71657729 +0000 UTC m=+323.985226058"
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.700930 4763 generic.go:334] "Generic (PLEG): container finished" podID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerID="e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f" exitCode=0 Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.700987 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f454l" event={"ID":"42e4fb80-00e7-430a-8154-e4e581437bfe","Type":"ContainerDied","Data":"e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f"} Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.711843 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkpnv" event={"ID":"ff859352-a99d-4a67-9126-ec6a056b3236","Type":"ContainerStarted","Data":"b9f7d7429fafb5c44e334969e332b7a533980dea8d8a59c154298f44997d6bd9"} Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.716336 4763 generic.go:334] "Generic (PLEG): container finished" podID="52470a23-807a-48bd-968d-eb43cb36b804" containerID="b5a80a214b14aa5d0ce693daa3aa21cd70656d368eeb420a20a99ed091aefcf7" exitCode=0 Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.716408 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfqr" event={"ID":"52470a23-807a-48bd-968d-eb43cb36b804","Type":"ContainerDied","Data":"b5a80a214b14aa5d0ce693daa3aa21cd70656d368eeb420a20a99ed091aefcf7"} Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.720829 4763 generic.go:334] "Generic (PLEG): container finished" podID="5008358d-1fd7-4aa1-ab44-3aa248f575a5" containerID="cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de" exitCode=0 Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.721447 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.721608 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" event={"ID":"5008358d-1fd7-4aa1-ab44-3aa248f575a5","Type":"ContainerDied","Data":"cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de"} Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.721629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv" event={"ID":"5008358d-1fd7-4aa1-ab44-3aa248f575a5","Type":"ContainerDied","Data":"636e4c1f2b648aab204b3757a8731b0f8ddadba3ffa931e33ff5df26f784b8a2"} Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.721645 4763 scope.go:117] "RemoveContainer" containerID="cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.749822 4763 scope.go:117] "RemoveContainer" containerID="cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de" Dec 01 09:20:07 crc kubenswrapper[4763]: E1201 09:20:07.750285 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de\": container with ID starting with cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de not found: ID does not exist" containerID="cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.750319 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de"} err="failed to get container status \"cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de\": rpc error: code = NotFound desc = could not find container \"cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de\": container with ID starting with cf3a887c1c7ad000fb459dc4125fdeb83ae898ee2a333b3efd4c1566865c37de not found: ID does not exist" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.789281 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4bdx\" (UniqueName: \"kubernetes.io/projected/5008358d-1fd7-4aa1-ab44-3aa248f575a5-kube-api-access-w4bdx\") pod \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.789359 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-config\") pod \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.789412 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5008358d-1fd7-4aa1-ab44-3aa248f575a5-serving-cert\") pod \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.789570 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-client-ca\") pod 
\"5008358d-1fd7-4aa1-ab44-3aa248f575a5\" (UID: \"5008358d-1fd7-4aa1-ab44-3aa248f575a5\") " Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.790087 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-config" (OuterVolumeSpecName: "config") pod "5008358d-1fd7-4aa1-ab44-3aa248f575a5" (UID: "5008358d-1fd7-4aa1-ab44-3aa248f575a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.790511 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-client-ca" (OuterVolumeSpecName: "client-ca") pod "5008358d-1fd7-4aa1-ab44-3aa248f575a5" (UID: "5008358d-1fd7-4aa1-ab44-3aa248f575a5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.791332 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkpnv" podStartSLOduration=3.7205821869999998 podStartE2EDuration="6.791319091s" podCreationTimestamp="2025-12-01 09:20:01 +0000 UTC" firstStartedPulling="2025-12-01 09:20:03.641481146 +0000 UTC m=+320.910129914" lastFinishedPulling="2025-12-01 09:20:06.71221805 +0000 UTC m=+323.980866818" observedRunningTime="2025-12-01 09:20:07.786465886 +0000 UTC m=+325.055114654" watchObservedRunningTime="2025-12-01 09:20:07.791319091 +0000 UTC m=+325.059967869" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.798181 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5008358d-1fd7-4aa1-ab44-3aa248f575a5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5008358d-1fd7-4aa1-ab44-3aa248f575a5" (UID: "5008358d-1fd7-4aa1-ab44-3aa248f575a5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.799402 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5008358d-1fd7-4aa1-ab44-3aa248f575a5-kube-api-access-w4bdx" (OuterVolumeSpecName: "kube-api-access-w4bdx") pod "5008358d-1fd7-4aa1-ab44-3aa248f575a5" (UID: "5008358d-1fd7-4aa1-ab44-3aa248f575a5"). InnerVolumeSpecName "kube-api-access-w4bdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.891327 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.891364 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5008358d-1fd7-4aa1-ab44-3aa248f575a5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.891377 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5008358d-1fd7-4aa1-ab44-3aa248f575a5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:07 crc kubenswrapper[4763]: I1201 09:20:07.891387 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4bdx\" (UniqueName: \"kubernetes.io/projected/5008358d-1fd7-4aa1-ab44-3aa248f575a5-kube-api-access-w4bdx\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:08 crc kubenswrapper[4763]: I1201 09:20:08.048886 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"] Dec 01 09:20:08 crc kubenswrapper[4763]: I1201 09:20:08.052525 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ffd8f858-6jdhv"] Dec 01 09:20:08 crc kubenswrapper[4763]: I1201 09:20:08.728094 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f454l" event={"ID":"42e4fb80-00e7-430a-8154-e4e581437bfe","Type":"ContainerStarted","Data":"7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d"} Dec 01 09:20:08 crc kubenswrapper[4763]: I1201 09:20:08.732334 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfqr" event={"ID":"52470a23-807a-48bd-968d-eb43cb36b804","Type":"ContainerStarted","Data":"6fdff6ab51f3568dd227d7b6304b779f1d7e21a57d2fd058dbbb946ed81802f2"} Dec 01 09:20:08 crc kubenswrapper[4763]: I1201 09:20:08.759806 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f454l" podStartSLOduration=2.080482169 podStartE2EDuration="4.759788696s" podCreationTimestamp="2025-12-01 09:20:04 +0000 UTC" firstStartedPulling="2025-12-01 09:20:05.661203558 +0000 UTC m=+322.929852326" lastFinishedPulling="2025-12-01 09:20:08.340510085 +0000 UTC m=+325.609158853" observedRunningTime="2025-12-01 09:20:08.744815429 +0000 UTC m=+326.013464197" watchObservedRunningTime="2025-12-01 09:20:08.759788696 +0000 UTC m=+326.028437464" Dec 01 09:20:08 crc kubenswrapper[4763]: I1201 09:20:08.761886 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xwfqr" podStartSLOduration=2.2653248599999998 podStartE2EDuration="4.761877783s" podCreationTimestamp="2025-12-01 09:20:04 +0000 UTC" firstStartedPulling="2025-12-01 09:20:05.657447714 +0000 UTC m=+322.926096482" lastFinishedPulling="2025-12-01 09:20:08.154000637 +0000 UTC m=+325.422649405" observedRunningTime="2025-12-01 09:20:08.758602662 +0000 UTC m=+326.027251440" watchObservedRunningTime="2025-12-01 09:20:08.761877783 +0000 UTC m=+326.030526551" Dec 01 09:20:08 crc kubenswrapper[4763]: I1201 09:20:08.999235 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5008358d-1fd7-4aa1-ab44-3aa248f575a5" path="/var/lib/kubelet/pods/5008358d-1fd7-4aa1-ab44-3aa248f575a5/volumes" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.077207 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k"] Dec 01 09:20:09 crc kubenswrapper[4763]: E1201 09:20:09.077427 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5008358d-1fd7-4aa1-ab44-3aa248f575a5" containerName="route-controller-manager" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.077440 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5008358d-1fd7-4aa1-ab44-3aa248f575a5" containerName="route-controller-manager" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.077551 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5008358d-1fd7-4aa1-ab44-3aa248f575a5" containerName="route-controller-manager" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.077882 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.080299 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.080498 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.080736 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.082387 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.082589 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.083602 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.098758 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k"] Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.207665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcjnc\" (UniqueName: \"kubernetes.io/projected/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-kube-api-access-zcjnc\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.207714 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-client-ca\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.207775 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-serving-cert\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.207805 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-config\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.308417 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-serving-cert\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.308493 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-config\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.308538 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcjnc\" (UniqueName: \"kubernetes.io/projected/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-kube-api-access-zcjnc\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.308557 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-client-ca\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.309375 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-client-ca\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.310622 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-config\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.326699 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-serving-cert\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.329120 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjnc\" (UniqueName: \"kubernetes.io/projected/cbc4681b-c41e-4ffa-b884-bf63d1b4147e-kube-api-access-zcjnc\") pod \"route-controller-manager-7656c99f76-hm47k\" (UID: \"cbc4681b-c41e-4ffa-b884-bf63d1b4147e\") " pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.393345 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:09 crc kubenswrapper[4763]: I1201 09:20:09.816763 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k"] Dec 01 09:20:09 crc kubenswrapper[4763]: W1201 09:20:09.820580 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc4681b_c41e_4ffa_b884_bf63d1b4147e.slice/crio-18d0646b4fd4bc215aa41e23b7bb020a6418ee8c35cd3a248850824565275679 WatchSource:0}: Error finding container 18d0646b4fd4bc215aa41e23b7bb020a6418ee8c35cd3a248850824565275679: Status 404 returned error can't find the container with id 18d0646b4fd4bc215aa41e23b7bb020a6418ee8c35cd3a248850824565275679 Dec 01 09:20:10 crc kubenswrapper[4763]: I1201 09:20:10.747252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" event={"ID":"cbc4681b-c41e-4ffa-b884-bf63d1b4147e","Type":"ContainerStarted","Data":"462b9056486ede982fc44303d62750c5120c2748d270bedff096c64798c42c38"} Dec 01 09:20:10 crc kubenswrapper[4763]: I1201 09:20:10.747604 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:10 crc kubenswrapper[4763]: I1201 09:20:10.747620 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" event={"ID":"cbc4681b-c41e-4ffa-b884-bf63d1b4147e","Type":"ContainerStarted","Data":"18d0646b4fd4bc215aa41e23b7bb020a6418ee8c35cd3a248850824565275679"} Dec 01 09:20:10 crc kubenswrapper[4763]: I1201 09:20:10.752926 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" Dec 01 09:20:10 crc kubenswrapper[4763]: I1201 09:20:10.766011 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" podStartSLOduration=3.765991298 podStartE2EDuration="3.765991298s" podCreationTimestamp="2025-12-01 09:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:20:10.76400029 +0000 UTC m=+328.032649058" watchObservedRunningTime="2025-12-01 09:20:10.765991298 +0000 UTC m=+328.034640066" Dec 01 09:20:12 crc kubenswrapper[4763]: I1201 09:20:12.259703 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:12 crc kubenswrapper[4763]: I1201 09:20:12.259773 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:12 crc kubenswrapper[4763]: I1201 09:20:12.307757 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:12 crc kubenswrapper[4763]: I1201 09:20:12.447577 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:12 crc kubenswrapper[4763]: I1201 09:20:12.447830 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:12 crc kubenswrapper[4763]: I1201 09:20:12.489562 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:12 crc kubenswrapper[4763]: I1201 09:20:12.797826 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 09:20:12 crc kubenswrapper[4763]: I1201 09:20:12.802773 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hc4lf" Dec 01 09:20:14 crc kubenswrapper[4763]: I1201 09:20:14.676701 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f454l" Dec 01 09:20:14 crc kubenswrapper[4763]: I1201 09:20:14.676753 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f454l" Dec 01 09:20:14 crc kubenswrapper[4763]: I1201 09:20:14.719163 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f454l" Dec 01 09:20:14 crc kubenswrapper[4763]: I1201 09:20:14.849862 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xwfqr" Dec 01 09:20:14 crc kubenswrapper[4763]: I1201 09:20:14.849910 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xwfqr" Dec 01 09:20:15 crc kubenswrapper[4763]: I1201 09:20:15.015116 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xwfqr" Dec 01 09:20:15 crc kubenswrapper[4763]: I1201 09:20:15.015174 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f454l" Dec 01 09:20:15 crc kubenswrapper[4763]: I1201 09:20:15.806215 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xwfqr" Dec 01 09:20:22 crc kubenswrapper[4763]: I1201 09:20:22.384233 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4" Dec 01 09:20:22 crc kubenswrapper[4763]: I1201 09:20:22.463361 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5rbk"] Dec 01 09:20:27 crc kubenswrapper[4763]: I1201 09:20:27.189206 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-699fc888d4-x74wv"] Dec 01 09:20:27 crc kubenswrapper[4763]: I1201 09:20:27.189711 4763 
Dec 01 09:20:22 crc kubenswrapper[4763]: I1201 09:20:22.384233 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-pnkw4"
Dec 01 09:20:22 crc kubenswrapper[4763]: I1201 09:20:22.463361 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5rbk"]
Dec 01 09:20:27 crc kubenswrapper[4763]: I1201 09:20:27.189206 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-699fc888d4-x74wv"]
Dec 01 09:20:27 crc kubenswrapper[4763]: I1201 09:20:27.189711 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" podUID="2c059dd2-cdca-4f0f-9978-e4838fa5c325" containerName="controller-manager" containerID="cri-o://ac3d1dc31e5df3ea5cda89e1bb752e2721e963381bfa03ae1699d14e8a0dffc2" gracePeriod=30
Dec 01 09:20:27 crc kubenswrapper[4763]: I1201 09:20:27.856043 4763 generic.go:334] "Generic (PLEG): container finished" podID="2c059dd2-cdca-4f0f-9978-e4838fa5c325" containerID="ac3d1dc31e5df3ea5cda89e1bb752e2721e963381bfa03ae1699d14e8a0dffc2" exitCode=0
Dec 01 09:20:27 crc kubenswrapper[4763]: I1201 09:20:27.856156 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" event={"ID":"2c059dd2-cdca-4f0f-9978-e4838fa5c325","Type":"ContainerDied","Data":"ac3d1dc31e5df3ea5cda89e1bb752e2721e963381bfa03ae1699d14e8a0dffc2"}
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.170349 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv"
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.307514 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c059dd2-cdca-4f0f-9978-e4838fa5c325-serving-cert\") pod \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") "
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.307610 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-config\") pod \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") "
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.307708 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp9c4\" (UniqueName: \"kubernetes.io/projected/2c059dd2-cdca-4f0f-9978-e4838fa5c325-kube-api-access-rp9c4\") pod \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") "
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.307749 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-client-ca\") pod \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") "
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.307784 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-proxy-ca-bundles\") pod \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\" (UID: \"2c059dd2-cdca-4f0f-9978-e4838fa5c325\") "
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.308531 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-client-ca" (OuterVolumeSpecName: "client-ca") pod "2c059dd2-cdca-4f0f-9978-e4838fa5c325" (UID: "2c059dd2-cdca-4f0f-9978-e4838fa5c325"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.308695 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2c059dd2-cdca-4f0f-9978-e4838fa5c325" (UID: "2c059dd2-cdca-4f0f-9978-e4838fa5c325"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.310132 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-config" (OuterVolumeSpecName: "config") pod "2c059dd2-cdca-4f0f-9978-e4838fa5c325" (UID: "2c059dd2-cdca-4f0f-9978-e4838fa5c325"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.313766 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c059dd2-cdca-4f0f-9978-e4838fa5c325-kube-api-access-rp9c4" (OuterVolumeSpecName: "kube-api-access-rp9c4") pod "2c059dd2-cdca-4f0f-9978-e4838fa5c325" (UID: "2c059dd2-cdca-4f0f-9978-e4838fa5c325"). InnerVolumeSpecName "kube-api-access-rp9c4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.322681 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c059dd2-cdca-4f0f-9978-e4838fa5c325-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c059dd2-cdca-4f0f-9978-e4838fa5c325" (UID: "2c059dd2-cdca-4f0f-9978-e4838fa5c325"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.409329 4763 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.409366 4763 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c059dd2-cdca-4f0f-9978-e4838fa5c325-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.409378 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-config\") on node \"crc\" DevicePath \"\""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.409390 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp9c4\" (UniqueName: \"kubernetes.io/projected/2c059dd2-cdca-4f0f-9978-e4838fa5c325-kube-api-access-rp9c4\") on node \"crc\" DevicePath \"\""
Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.409404 4763 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2c059dd2-cdca-4f0f-9978-e4838fa5c325-client-ca\") on node \"crc\" DevicePath \"\""
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699fc888d4-x74wv" Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.861979 4763 scope.go:117] "RemoveContainer" containerID="ac3d1dc31e5df3ea5cda89e1bb752e2721e963381bfa03ae1699d14e8a0dffc2" Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.896744 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-699fc888d4-x74wv"] Dec 01 09:20:28 crc kubenswrapper[4763]: I1201 09:20:28.899881 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-699fc888d4-x74wv"] Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.001156 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c059dd2-cdca-4f0f-9978-e4838fa5c325" path="/var/lib/kubelet/pods/2c059dd2-cdca-4f0f-9978-e4838fa5c325/volumes" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.093695 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-vnxbq"] Dec 01 09:20:29 crc kubenswrapper[4763]: E1201 09:20:29.094400 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c059dd2-cdca-4f0f-9978-e4838fa5c325" containerName="controller-manager" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.094413 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c059dd2-cdca-4f0f-9978-e4838fa5c325" containerName="controller-manager" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.094525 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c059dd2-cdca-4f0f-9978-e4838fa5c325" containerName="controller-manager" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.095404 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.097191 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.098894 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.099172 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.099335 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.099514 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.100398 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.106151 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-vnxbq"] Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.106296 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.242149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-client-ca\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.242199 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5273af4-4768-4ce2-a080-32bdcb3527cc-serving-cert\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.242225 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-proxy-ca-bundles\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.242271 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-config\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.242293 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng2r9\" (UniqueName: 
\"kubernetes.io/projected/e5273af4-4768-4ce2-a080-32bdcb3527cc-kube-api-access-ng2r9\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.343896 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-client-ca\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.343956 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5273af4-4768-4ce2-a080-32bdcb3527cc-serving-cert\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.343986 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-proxy-ca-bundles\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.344034 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-config\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.344058 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng2r9\" (UniqueName: \"kubernetes.io/projected/e5273af4-4768-4ce2-a080-32bdcb3527cc-kube-api-access-ng2r9\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.345861 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-client-ca\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.346043 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-proxy-ca-bundles\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.346797 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5273af4-4768-4ce2-a080-32bdcb3527cc-config\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 
01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.350132 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5273af4-4768-4ce2-a080-32bdcb3527cc-serving-cert\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.371382 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng2r9\" (UniqueName: \"kubernetes.io/projected/e5273af4-4768-4ce2-a080-32bdcb3527cc-kube-api-access-ng2r9\") pod \"controller-manager-884c54fcb-vnxbq\" (UID: \"e5273af4-4768-4ce2-a080-32bdcb3527cc\") " pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.463363 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.641727 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-884c54fcb-vnxbq"] Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.868250 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" event={"ID":"e5273af4-4768-4ce2-a080-32bdcb3527cc","Type":"ContainerStarted","Data":"8e1ea3e6cc9da4cb5e9efd7dfd2a5c4e37c9797c5e580adcb18cf5cdd64e37ac"} Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.868637 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.868653 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" event={"ID":"e5273af4-4768-4ce2-a080-32bdcb3527cc","Type":"ContainerStarted","Data":"25686718e342e8368d23a4829976a5032d888357ea1a91d3ab05016fa2e987a3"} Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.873414 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" Dec 01 09:20:29 crc kubenswrapper[4763]: I1201 09:20:29.884694 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" podStartSLOduration=2.884673324 podStartE2EDuration="2.884673324s" podCreationTimestamp="2025-12-01 09:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:20:29.880766347 +0000 UTC m=+347.149415115" watchObservedRunningTime="2025-12-01 09:20:29.884673324 +0000 UTC m=+347.153322092" Dec 01 09:20:33 crc kubenswrapper[4763]: I1201 09:20:33.929072 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:20:33 crc kubenswrapper[4763]: I1201 09:20:33.929688 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:20:47 crc kubenswrapper[4763]: I1201 09:20:47.522311 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" podUID="a41fc022-b655-4924-8b6a-dd3cd87ef9ba" containerName="registry" containerID="cri-o://59d35de8991a1e7d17b821d9c7f00f3b17921c43666c1735b1eb24f77d9f05e3" gracePeriod=30 Dec 01 09:20:47 crc kubenswrapper[4763]: I1201 09:20:47.969916 4763 generic.go:334] "Generic (PLEG): container finished" podID="a41fc022-b655-4924-8b6a-dd3cd87ef9ba" containerID="59d35de8991a1e7d17b821d9c7f00f3b17921c43666c1735b1eb24f77d9f05e3" exitCode=0 Dec 01 09:20:47 crc kubenswrapper[4763]: I1201 09:20:47.969956 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" event={"ID":"a41fc022-b655-4924-8b6a-dd3cd87ef9ba","Type":"ContainerDied","Data":"59d35de8991a1e7d17b821d9c7f00f3b17921c43666c1735b1eb24f77d9f05e3"} Dec 01 09:20:47 crc kubenswrapper[4763]: I1201 09:20:47.969986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" event={"ID":"a41fc022-b655-4924-8b6a-dd3cd87ef9ba","Type":"ContainerDied","Data":"133cf3bbf28e832226d57ae969b11fcd09de3bd7bf4d61bfe661fae22295fe37"} Dec 01 09:20:47 crc kubenswrapper[4763]: I1201 09:20:47.969999 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133cf3bbf28e832226d57ae969b11fcd09de3bd7bf4d61bfe661fae22295fe37" Dec 01 09:20:47 crc kubenswrapper[4763]: I1201 09:20:47.992253 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.096121 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-installation-pull-secrets\") pod \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.096198 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-ca-trust-extracted\") pod \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.096239 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-bound-sa-token\") pod \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.096263 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-certificates\") pod \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.096360 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-trusted-ca\") pod 
\"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.096390 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-tls\") pod \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.096423 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l66sm\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-kube-api-access-l66sm\") pod \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.097015 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a41fc022-b655-4924-8b6a-dd3cd87ef9ba" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.097093 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a41fc022-b655-4924-8b6a-dd3cd87ef9ba" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.097303 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\" (UID: \"a41fc022-b655-4924-8b6a-dd3cd87ef9ba\") " Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.097815 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.097842 4763 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.101241 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a41fc022-b655-4924-8b6a-dd3cd87ef9ba" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.105337 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-kube-api-access-l66sm" (OuterVolumeSpecName: "kube-api-access-l66sm") pod "a41fc022-b655-4924-8b6a-dd3cd87ef9ba" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba"). InnerVolumeSpecName "kube-api-access-l66sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.105535 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a41fc022-b655-4924-8b6a-dd3cd87ef9ba" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.106117 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a41fc022-b655-4924-8b6a-dd3cd87ef9ba" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.106976 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a41fc022-b655-4924-8b6a-dd3cd87ef9ba" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.114007 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a41fc022-b655-4924-8b6a-dd3cd87ef9ba" (UID: "a41fc022-b655-4924-8b6a-dd3cd87ef9ba"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.199425 4763 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.199474 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l66sm\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-kube-api-access-l66sm\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.199711 4763 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.199719 4763 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.199728 4763 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a41fc022-b655-4924-8b6a-dd3cd87ef9ba-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 01 09:20:48 crc kubenswrapper[4763]: I1201 09:20:48.976954 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-v5rbk" Dec 01 09:20:49 crc kubenswrapper[4763]: I1201 09:20:49.026007 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5rbk"] Dec 01 09:20:49 crc kubenswrapper[4763]: I1201 09:20:49.032630 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-v5rbk"] Dec 01 09:20:51 crc kubenswrapper[4763]: I1201 09:20:51.000511 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41fc022-b655-4924-8b6a-dd3cd87ef9ba" path="/var/lib/kubelet/pods/a41fc022-b655-4924-8b6a-dd3cd87ef9ba/volumes" Dec 01 09:21:03 crc kubenswrapper[4763]: I1201 09:21:03.929782 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:21:03 crc kubenswrapper[4763]: I1201 09:21:03.930398 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:21:33 crc kubenswrapper[4763]: I1201 09:21:33.929767 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:21:33 crc kubenswrapper[4763]: I1201 09:21:33.930236 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:21:33 crc kubenswrapper[4763]: I1201 09:21:33.930281 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:21:33 crc kubenswrapper[4763]: I1201 09:21:33.930831 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e04da687e5f490cd7c72b41a50dd42a262635c23ccef8396950a220f1c76c07c"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:21:33 crc kubenswrapper[4763]: I1201 09:21:33.930882 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://e04da687e5f490cd7c72b41a50dd42a262635c23ccef8396950a220f1c76c07c" gracePeriod=600 Dec 01 09:21:34 crc kubenswrapper[4763]: I1201 09:21:34.228326 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="e04da687e5f490cd7c72b41a50dd42a262635c23ccef8396950a220f1c76c07c" exitCode=0 Dec 01 09:21:34 crc kubenswrapper[4763]: I1201 09:21:34.228402 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"e04da687e5f490cd7c72b41a50dd42a262635c23ccef8396950a220f1c76c07c"} Dec 01 09:21:34 crc kubenswrapper[4763]: I1201 09:21:34.228716 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"2225c7eb7487a6ea4c1fbbf2365ffe08cfe7e24776ec45b7e270f3787e713463"} Dec 01 09:21:34 crc kubenswrapper[4763]: I1201 09:21:34.228731 4763 scope.go:117] "RemoveContainer" containerID="3906a5f5363a6f9df7a54c6837261015cb19cb3e223c1d4dc2c1fcede33835af" Dec 01 09:21:43 crc kubenswrapper[4763]: I1201 09:21:43.175690 4763 scope.go:117] "RemoveContainer" containerID="7f0f7aed7589fa0af3dc70a8ecfe1c0dd8052d13b96b19e885eae45fcb2af138" Dec 01 09:21:43 crc kubenswrapper[4763]: I1201 09:21:43.189018 4763 scope.go:117] "RemoveContainer" containerID="150bb4d62ee9aa5859d4b785036e59266ddd00ab07d9d08d7693dbb0ccb66644" Dec 01 09:21:43 crc kubenswrapper[4763]: I1201 09:21:43.204919 4763 scope.go:117] "RemoveContainer" containerID="966d871470c191c4477d865373d9f474107f38ac32caea3a4d298f2fbb3d776b" Dec 01 09:21:43 crc kubenswrapper[4763]: I1201 09:21:43.221990 4763 scope.go:117] "RemoveContainer" containerID="482f04350e2e0671c1c696866912fdb91de2aed4f746b078f568b101a5888422" Dec 01 09:21:43 crc kubenswrapper[4763]: I1201 09:21:43.235718 4763 scope.go:117] "RemoveContainer" containerID="497e30acccf80b21bd12ecef38126dbd8160cd12415c93ecc890c760d6840c8c" Dec 01 09:21:43 crc kubenswrapper[4763]: I1201 09:21:43.261631 4763 scope.go:117] "RemoveContainer" containerID="c9712fa2bcf3b3a35aa0bfc9a9a6a7a4a4d66f62fcc2bbb3fc7768d7f6a66c7b" Dec 01 09:23:43 crc kubenswrapper[4763]: I1201 09:23:43.308749 4763 scope.go:117] "RemoveContainer" containerID="59d35de8991a1e7d17b821d9c7f00f3b17921c43666c1735b1eb24f77d9f05e3" Dec 01 09:23:43 crc kubenswrapper[4763]: I1201 09:23:43.327710 4763 scope.go:117] "RemoveContainer" containerID="5457caff952e4a23d58b648667e2f9f4917f7e55ee076208a69a333323e430f2" Dec 01 09:23:43 crc kubenswrapper[4763]: I1201 09:23:43.347375 4763 scope.go:117] "RemoveContainer" containerID="7ac6af7c882815e3b8611f5d2ef5e310e87561295f3e3f82722aacb37b0f2513" Dec 01 09:24:03 crc kubenswrapper[4763]: I1201 09:24:03.929823 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:24:03 crc kubenswrapper[4763]: I1201 09:24:03.930519 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:24:33 crc kubenswrapper[4763]: I1201 09:24:33.929945 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:24:34 crc kubenswrapper[4763]: I1201 
09:24:33.931433 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:25:03 crc kubenswrapper[4763]: I1201 09:25:03.929477 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:25:03 crc kubenswrapper[4763]: I1201 09:25:03.931203 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:25:03 crc kubenswrapper[4763]: I1201 09:25:03.931391 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:25:03 crc kubenswrapper[4763]: I1201 09:25:03.932203 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2225c7eb7487a6ea4c1fbbf2365ffe08cfe7e24776ec45b7e270f3787e713463"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:25:03 crc kubenswrapper[4763]: I1201 09:25:03.932368 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://2225c7eb7487a6ea4c1fbbf2365ffe08cfe7e24776ec45b7e270f3787e713463" gracePeriod=600 Dec 01 09:25:04 crc kubenswrapper[4763]: I1201 09:25:04.411593 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="2225c7eb7487a6ea4c1fbbf2365ffe08cfe7e24776ec45b7e270f3787e713463" exitCode=0 Dec 01 09:25:04 crc kubenswrapper[4763]: I1201 09:25:04.411729 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"2225c7eb7487a6ea4c1fbbf2365ffe08cfe7e24776ec45b7e270f3787e713463"} Dec 01 09:25:04 crc kubenswrapper[4763]: I1201 09:25:04.412015 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"d9e0e5adb882a530747c6596a975101cf0f536a3cb28e48dd137e2024a6a05f6"} Dec 01 09:25:04 crc kubenswrapper[4763]: I1201 09:25:04.412043 4763 scope.go:117] "RemoveContainer" containerID="e04da687e5f490cd7c72b41a50dd42a262635c23ccef8396950a220f1c76c07c" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.241059 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tq2xd"] Dec 01 09:25:32 crc kubenswrapper[4763]: E1201 09:25:32.241774 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a41fc022-b655-4924-8b6a-dd3cd87ef9ba" containerName="registry" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.241786 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41fc022-b655-4924-8b6a-dd3cd87ef9ba" containerName="registry" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.241883 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41fc022-b655-4924-8b6a-dd3cd87ef9ba" containerName="registry" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.242228 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-tq2xd" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.245500 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.246002 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.246259 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tq2xd"] Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.247121 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-67vxs" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.254741 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-s2hg7"] Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.255625 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-s2hg7" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.278958 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8d4zz" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.280715 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mjbm5"] Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.281361 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.284588 4763 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-gz8wl" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.305868 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-s2hg7"] Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.313734 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mjbm5"] Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.415438 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wpc5\" (UniqueName: \"kubernetes.io/projected/99f9aa79-5bae-4215-b137-baef58e56e96-kube-api-access-9wpc5\") pod \"cert-manager-cainjector-7f985d654d-tq2xd\" (UID: \"99f9aa79-5bae-4215-b137-baef58e56e96\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-tq2xd" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.415491 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft67l\" (UniqueName: \"kubernetes.io/projected/8c962037-246b-4727-8aab-6632e2e9e5f7-kube-api-access-ft67l\") pod \"cert-manager-webhook-5655c58dd6-mjbm5\" (UID: \"8c962037-246b-4727-8aab-6632e2e9e5f7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.415509 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmz8b\" (UniqueName: \"kubernetes.io/projected/f4a5555c-4f44-4a2e-a9bf-6daab8490e32-kube-api-access-nmz8b\") pod \"cert-manager-5b446d88c5-s2hg7\" (UID: \"f4a5555c-4f44-4a2e-a9bf-6daab8490e32\") " pod="cert-manager/cert-manager-5b446d88c5-s2hg7" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.516664 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wpc5\" (UniqueName: \"kubernetes.io/projected/99f9aa79-5bae-4215-b137-baef58e56e96-kube-api-access-9wpc5\") pod \"cert-manager-cainjector-7f985d654d-tq2xd\" (UID: \"99f9aa79-5bae-4215-b137-baef58e56e96\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-tq2xd" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.516702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft67l\" (UniqueName: \"kubernetes.io/projected/8c962037-246b-4727-8aab-6632e2e9e5f7-kube-api-access-ft67l\") pod \"cert-manager-webhook-5655c58dd6-mjbm5\" (UID: \"8c962037-246b-4727-8aab-6632e2e9e5f7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.516727 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmz8b\" (UniqueName: \"kubernetes.io/projected/f4a5555c-4f44-4a2e-a9bf-6daab8490e32-kube-api-access-nmz8b\") pod \"cert-manager-5b446d88c5-s2hg7\" (UID: \"f4a5555c-4f44-4a2e-a9bf-6daab8490e32\") " pod="cert-manager/cert-manager-5b446d88c5-s2hg7" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.536215 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wpc5\" (UniqueName: \"kubernetes.io/projected/99f9aa79-5bae-4215-b137-baef58e56e96-kube-api-access-9wpc5\") pod \"cert-manager-cainjector-7f985d654d-tq2xd\" (UID: \"99f9aa79-5bae-4215-b137-baef58e56e96\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-tq2xd" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.536268 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmz8b\" (UniqueName: \"kubernetes.io/projected/f4a5555c-4f44-4a2e-a9bf-6daab8490e32-kube-api-access-nmz8b\") pod \"cert-manager-5b446d88c5-s2hg7\" (UID: \"f4a5555c-4f44-4a2e-a9bf-6daab8490e32\") " pod="cert-manager/cert-manager-5b446d88c5-s2hg7" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.536592 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft67l\" (UniqueName: \"kubernetes.io/projected/8c962037-246b-4727-8aab-6632e2e9e5f7-kube-api-access-ft67l\") pod \"cert-manager-webhook-5655c58dd6-mjbm5\" (UID: \"8c962037-246b-4727-8aab-6632e2e9e5f7\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.569479 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-tq2xd" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.605769 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-s2hg7" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.625689 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5" Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.914496 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-s2hg7"] Dec 01 09:25:32 crc kubenswrapper[4763]: I1201 09:25:32.920509 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:25:33 crc kubenswrapper[4763]: I1201 09:25:33.060356 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-tq2xd"] Dec 01 09:25:33 crc kubenswrapper[4763]: W1201 09:25:33.063793 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99f9aa79_5bae_4215_b137_baef58e56e96.slice/crio-2b219ee6e89b440c38297a8de2b42fc0d0b885617c0546a66af9f334251d7423 WatchSource:0}: Error finding container 2b219ee6e89b440c38297a8de2b42fc0d0b885617c0546a66af9f334251d7423: Status 404 returned error can't find the container with id 2b219ee6e89b440c38297a8de2b42fc0d0b885617c0546a66af9f334251d7423 Dec 01 09:25:33 crc kubenswrapper[4763]: I1201 09:25:33.091578 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-mjbm5"] Dec 01 09:25:33 crc kubenswrapper[4763]: W1201 09:25:33.098563 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c962037_246b_4727_8aab_6632e2e9e5f7.slice/crio-7afe1a67e23023bec304193b014a3f424cb15fae13a101745e44cb0570e82de4 WatchSource:0}: Error finding container 7afe1a67e23023bec304193b014a3f424cb15fae13a101745e44cb0570e82de4: Status 404 returned error can't find the container with id 7afe1a67e23023bec304193b014a3f424cb15fae13a101745e44cb0570e82de4 Dec 01 09:25:33 crc kubenswrapper[4763]: I1201 09:25:33.602556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5" event={"ID":"8c962037-246b-4727-8aab-6632e2e9e5f7","Type":"ContainerStarted","Data":"7afe1a67e23023bec304193b014a3f424cb15fae13a101745e44cb0570e82de4"} 
Dec 01 09:25:33 crc kubenswrapper[4763]: I1201 09:25:33.603327 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-tq2xd" event={"ID":"99f9aa79-5bae-4215-b137-baef58e56e96","Type":"ContainerStarted","Data":"2b219ee6e89b440c38297a8de2b42fc0d0b885617c0546a66af9f334251d7423"}
Dec 01 09:25:33 crc kubenswrapper[4763]: I1201 09:25:33.604144 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-s2hg7" event={"ID":"f4a5555c-4f44-4a2e-a9bf-6daab8490e32","Type":"ContainerStarted","Data":"e6706c417f603d94a73871bd8dec646bc9bfbbc86b147542c9a4dcf1846cdbd1"}
Dec 01 09:25:35 crc kubenswrapper[4763]: I1201 09:25:35.614179 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-s2hg7" event={"ID":"f4a5555c-4f44-4a2e-a9bf-6daab8490e32","Type":"ContainerStarted","Data":"f2fe77ae3d7fecc877219675162428bbfc773ab0c4488b2f6a37bfe4570fcdb2"}
Dec 01 09:25:35 crc kubenswrapper[4763]: I1201 09:25:35.627770 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-s2hg7" podStartSLOduration=1.458322539 podStartE2EDuration="3.627749251s" podCreationTimestamp="2025-12-01 09:25:32 +0000 UTC" firstStartedPulling="2025-12-01 09:25:32.920205304 +0000 UTC m=+650.188854072" lastFinishedPulling="2025-12-01 09:25:35.089632016 +0000 UTC m=+652.358280784" observedRunningTime="2025-12-01 09:25:35.62536869 +0000 UTC m=+652.894017458" watchObservedRunningTime="2025-12-01 09:25:35.627749251 +0000 UTC m=+652.896398019"
Dec 01 09:25:37 crc kubenswrapper[4763]: I1201 09:25:37.627802 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5" event={"ID":"8c962037-246b-4727-8aab-6632e2e9e5f7","Type":"ContainerStarted","Data":"9e7995203827cce78d7037949ef88c0747da4f8783a30a0178d5e1137df5a0b3"}
Dec 01 09:25:37 crc kubenswrapper[4763]: I1201 09:25:37.628025 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5"
Dec 01 09:25:37 crc kubenswrapper[4763]: I1201 09:25:37.629963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-tq2xd" event={"ID":"99f9aa79-5bae-4215-b137-baef58e56e96","Type":"ContainerStarted","Data":"9d9539bc79ac263d77cc76ee9670b04f8a2325c8f89724d468764ee72f30d772"}
Dec 01 09:25:37 crc kubenswrapper[4763]: I1201 09:25:37.650182 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5" podStartSLOduration=1.9699409 podStartE2EDuration="5.650163257s" podCreationTimestamp="2025-12-01 09:25:32 +0000 UTC" firstStartedPulling="2025-12-01 09:25:33.101133086 +0000 UTC m=+650.369781844" lastFinishedPulling="2025-12-01 09:25:36.781355423 +0000 UTC m=+654.050004201" observedRunningTime="2025-12-01 09:25:37.644418014 +0000 UTC m=+654.913066782" watchObservedRunningTime="2025-12-01 09:25:37.650163257 +0000 UTC m=+654.918812045"
Dec 01 09:25:37 crc kubenswrapper[4763]: I1201 09:25:37.666684 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-tq2xd" podStartSLOduration=1.9507900089999999 podStartE2EDuration="5.666656151s" podCreationTimestamp="2025-12-01 09:25:32 +0000 UTC" firstStartedPulling="2025-12-01 09:25:33.065503512 +0000 UTC m=+650.334152280" lastFinishedPulling="2025-12-01 09:25:36.781369654 +0000 UTC m=+654.050018422" observedRunningTime="2025-12-01 09:25:37.664030923 +0000 UTC m=+654.932679701" watchObservedRunningTime="2025-12-01 09:25:37.666656151 +0000 UTC m=+654.935304939"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.339732 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rpg27"]
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.341013 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovn-controller" containerID="cri-o://184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2" gracePeriod=30
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.341029 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="northd" containerID="cri-o://c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12" gracePeriod=30
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.341180 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24" gracePeriod=30
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.341203 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="sbdb" containerID="cri-o://ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d" gracePeriod=30
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.341241 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kube-rbac-proxy-node" containerID="cri-o://88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829" gracePeriod=30
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.341265 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="nbdb" containerID="cri-o://d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302" gracePeriod=30
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.341297 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovn-acl-logging" containerID="cri-o://6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5" gracePeriod=30
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.396009 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller" containerID="cri-o://bb05ca306384e4013c5fc1b1f221725c94ed7d5bc1c9a6d8893fafd9ab0449df" gracePeriod=30
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.629517 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-mjbm5"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.658927 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/2.log"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.659533 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/1.log"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.659575 4763 generic.go:334] "Generic (PLEG): container finished" podID="192e1ecd-fa1f-4227-a40c-4f7773682880" containerID="f7559b28f34a26d39e47f026311b6e84e5ea88b4fa9d864b01d93cf5a16b187e" exitCode=2
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.659632 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr552" event={"ID":"192e1ecd-fa1f-4227-a40c-4f7773682880","Type":"ContainerDied","Data":"f7559b28f34a26d39e47f026311b6e84e5ea88b4fa9d864b01d93cf5a16b187e"}
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.659676 4763 scope.go:117] "RemoveContainer" containerID="92bbcf2c85f7609d564c0b4d13941caa2621b0ba7ce7f60940642227d10c0705"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.660211 4763 scope.go:117] "RemoveContainer" containerID="f7559b28f34a26d39e47f026311b6e84e5ea88b4fa9d864b01d93cf5a16b187e"
Dec 01 09:25:42 crc kubenswrapper[4763]: E1201 09:25:42.660400 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-fr552_openshift-multus(192e1ecd-fa1f-4227-a40c-4f7773682880)\"" pod="openshift-multus/multus-fr552" podUID="192e1ecd-fa1f-4227-a40c-4f7773682880"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.667376 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovnkube-controller/3.log"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.670652 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovn-acl-logging/0.log"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671088 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovn-controller/0.log"
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671644 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="bb05ca306384e4013c5fc1b1f221725c94ed7d5bc1c9a6d8893fafd9ab0449df" exitCode=0
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671667 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12" exitCode=0
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671678 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24" exitCode=0
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671688 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829" exitCode=0
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671697 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5" exitCode=143
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671708 4763 generic.go:334] "Generic (PLEG): container finished" podID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerID="184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2" exitCode=143
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671730 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"bb05ca306384e4013c5fc1b1f221725c94ed7d5bc1c9a6d8893fafd9ab0449df"}
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671760 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12"}
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671776 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24"}
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829"}
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671801 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5"}
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.671812 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2"}
Dec 01 09:25:42 crc kubenswrapper[4763]: I1201 09:25:42.739607 4763 scope.go:117] "RemoveContainer" containerID="de6e64bfd272382d712bffa0f8236236bb8694078373592c6bc4417644ee9ee3"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.048908 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovn-acl-logging/0.log"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.049841 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpg27_e57a17bb-0609-4f45-ac9a-af60af65cdd9/ovn-controller/0.log"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.050731 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124270 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8tw7k"]
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124597 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kubecfg-setup"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124623 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kubecfg-setup"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124639 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovn-acl-logging"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124650 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovn-acl-logging"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124665 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="sbdb"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124678 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="sbdb"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124719 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124731 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124743 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovn-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124754 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovn-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124768 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124778 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124801 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="nbdb"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124811 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="nbdb"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124824 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kube-rbac-proxy-ovn-metrics"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124835 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kube-rbac-proxy-ovn-metrics"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124850 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="northd"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124861 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="northd"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124877 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124887 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.124902 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kube-rbac-proxy-node"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.124913 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kube-rbac-proxy-node"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125041 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kube-rbac-proxy-ovn-metrics"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125058 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="sbdb"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125067 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="kube-rbac-proxy-node"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125076 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125086 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="nbdb"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125096 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125107 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovn-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125115 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="northd"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125125 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125134 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovn-acl-logging"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125145 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.125533 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125545 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: E1201 09:25:43.125558 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125568 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.125700 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" containerName="ovnkube-controller"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.127987 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-openvswitch\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159412 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzr9g\" (UniqueName: \"kubernetes.io/projected/e57a17bb-0609-4f45-ac9a-af60af65cdd9-kube-api-access-wzr9g\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159322 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159578 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-netd\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159688 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-netns\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159747 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-etc-openvswitch\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159791 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-ovn-kubernetes\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159819 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-var-lib-openvswitch\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159841 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159845 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159871 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159896 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159795 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159919 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-kubelet\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159934 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159950 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-env-overrides\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159955 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159985 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-node-log\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-config\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160061 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovn-node-metrics-cert\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160097 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-systemd-units\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160134 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-script-lib\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160171 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-systemd\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160202 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-log-socket\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160250 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-bin\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160277 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-slash\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160316 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-ovn\") pod \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\" (UID: \"e57a17bb-0609-4f45-ac9a-af60af65cdd9\") "
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160062 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-node-log" (OuterVolumeSpecName: "node-log") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160520 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160667 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160565 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160584 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-log-socket" (OuterVolumeSpecName: "log-socket") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160618 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "systemd-units".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160759 4763 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160773 4763 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160782 4763 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160792 4763 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160802 4763 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160812 4763 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160822 4763 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160834 4763 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.160842 4763 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-node-log\") on node \"crc\" DevicePath \"\"" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.159702 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.161245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-slash" (OuterVolumeSpecName: "host-slash") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.161329 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.161802 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.164203 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57a17bb-0609-4f45-ac9a-af60af65cdd9-kube-api-access-wzr9g" (OuterVolumeSpecName: "kube-api-access-wzr9g") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "kube-api-access-wzr9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.164700 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.171737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e57a17bb-0609-4f45-ac9a-af60af65cdd9" (UID: "e57a17bb-0609-4f45-ac9a-af60af65cdd9"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.261590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-cni-bin\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.261822 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-env-overrides\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.261959 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovnkube-config\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262041 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262108 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovn-node-metrics-cert\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262154 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovnkube-script-lib\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262234 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-systemd-units\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262307 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-node-log\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262386 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-etc-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262449 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-ovn\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262543 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-systemd\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262602 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-cni-netd\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262671 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkk4\" (UniqueName: \"kubernetes.io/projected/d86e127d-5d15-470a-bc56-2ecde45ef4fe-kube-api-access-6hkk4\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-var-lib-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.262944 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-log-socket\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263011 4763 
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263069 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-run-netns\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263138 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-slash\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263232 4763 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263287 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzr9g\" (UniqueName: \"kubernetes.io/projected/e57a17bb-0609-4f45-ac9a-af60af65cdd9-kube-api-access-wzr9g\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263335 4763 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-netd\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263387 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263434 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263510 4763 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-systemd-units\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263559 4763 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e57a17bb-0609-4f45-ac9a-af60af65cdd9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263605 4763 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-run-systemd\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263661 4763 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-log-socket\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263708 4763 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-slash\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.263786 4763 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e57a17bb-0609-4f45-ac9a-af60af65cdd9-host-cni-bin\") on node \"crc\" DevicePath \"\""
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364745 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-systemd\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364783 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-cni-netd\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364803 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364828 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364857 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkk4\" (UniqueName: \"kubernetes.io/projected/d86e127d-5d15-470a-bc56-2ecde45ef4fe-kube-api-access-6hkk4\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364874 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-var-lib-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364890 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-log-socket\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364907 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-kubelet\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364864 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-systemd\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364953 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-run-netns\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365005 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-var-lib-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.364922 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-run-netns\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365025 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-log-socket\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365048 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-kubelet\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365071 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365084 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365098 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-cni-netd\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365354 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-slash\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365387 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-slash\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365399 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-cni-bin\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365442 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-env-overrides\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365620 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovnkube-config\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365664 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365734 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovn-node-metrics-cert\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365777 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovnkube-script-lib\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365820 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-systemd-units\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365856 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-node-log\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k"
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-node-log\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-etc-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365931 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-ovn\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-env-overrides\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.365495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-cni-bin\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.366512 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-etc-openvswitch\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.366539 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-node-log\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.366568 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.366572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-run-ovn\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.366597 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d86e127d-5d15-470a-bc56-2ecde45ef4fe-systemd-units\") pod \"ovnkube-node-8tw7k\" 
(UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.366933 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovnkube-config\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.367320 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovnkube-script-lib\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.370088 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d86e127d-5d15-470a-bc56-2ecde45ef4fe-ovn-node-metrics-cert\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.382773 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkk4\" (UniqueName: \"kubernetes.io/projected/d86e127d-5d15-470a-bc56-2ecde45ef4fe-kube-api-access-6hkk4\") pod \"ovnkube-node-8tw7k\" (UID: \"d86e127d-5d15-470a-bc56-2ecde45ef4fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.391238 4763 scope.go:117] "RemoveContainer" containerID="c032f5504108cbf9967a58bfa3c2a435644e2d98b41e53b6c30c24b60921fa12" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.405264 4763 scope.go:117] "RemoveContainer" containerID="bb05ca306384e4013c5fc1b1f221725c94ed7d5bc1c9a6d8893fafd9ab0449df" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.419017 4763 scope.go:117] "RemoveContainer" containerID="88481e6184aa4b2fce5f7c28249a921086a86c543f8de779a0c237ef106ae829" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.429931 4763 scope.go:117] "RemoveContainer" containerID="ba4cd92e2799a619a2c2623b3141d81597f777a9b78396470395f8187ec72ec1" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.441403 4763 scope.go:117] "RemoveContainer" containerID="ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.445009 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.458191 4763 scope.go:117] "RemoveContainer" containerID="6428ac317b2f3db0639da659930ec4adc8fe3799c8c36faa13fd48e5f5b83ca5" Dec 01 09:25:43 crc kubenswrapper[4763]: W1201 09:25:43.466565 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd86e127d_5d15_470a_bc56_2ecde45ef4fe.slice/crio-af318320738f2531b7a4f9c499fa5aa72277976abff5f3ec9a990c785971accd WatchSource:0}: Error finding container af318320738f2531b7a4f9c499fa5aa72277976abff5f3ec9a990c785971accd: Status 404 returned error can't find the container with id af318320738f2531b7a4f9c499fa5aa72277976abff5f3ec9a990c785971accd Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.532564 4763 scope.go:117] "RemoveContainer" containerID="d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.556135 4763 scope.go:117] "RemoveContainer" containerID="73f7e15dc726fddbaf6da0f7ac69bb453d0934ca27df470760e0e59ea67f2d24" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.586673 4763 scope.go:117] "RemoveContainer" containerID="184d6ca286aff84d8bd607374737c1167d9f552141429f626dccc454feda6cf2" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.677994 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/2.log" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.680505 4763 generic.go:334] "Generic (PLEG): container finished" podID="d86e127d-5d15-470a-bc56-2ecde45ef4fe" containerID="e0b6d86dba5c6c35c0fec7984f8c2c42c7ecb32ea21e45c8a95f437817762360" exitCode=0 Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.680614 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.680622 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerDied","Data":"e0b6d86dba5c6c35c0fec7984f8c2c42c7ecb32ea21e45c8a95f437817762360"} Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.680687 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"af318320738f2531b7a4f9c499fa5aa72277976abff5f3ec9a990c785971accd"} Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.680707 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"ed1789ba421087260e16ab5a8945938412a85421cba0a18cd42c15e30c5d009d"} Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.680770 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"d84f6adb91a9876925656821444f8bff18d2a45612526eb593832b43a3d92302"} Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.680792 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpg27" event={"ID":"e57a17bb-0609-4f45-ac9a-af60af65cdd9","Type":"ContainerDied","Data":"56930ca0aecb1732acbbc62e6b3478c1db76f269169a9eaebd00603d68907eb0"} Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.758568 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rpg27"] Dec 01 09:25:43 crc kubenswrapper[4763]: I1201 09:25:43.761396 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rpg27"] Dec 01 09:25:44 crc kubenswrapper[4763]: I1201 09:25:44.689147 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"95c4ef47361c7fa597bebc0d936a0977de515676599c47bd3b69620223fe8b5f"} Dec 01 09:25:44 crc kubenswrapper[4763]: I1201 09:25:44.689480 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"ceff9264c25614daef6c1f3180baccbbb2263c74f0f05c447827eeb4fec88675"} Dec 01 09:25:44 crc kubenswrapper[4763]: I1201 09:25:44.689497 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"4534f5c5bb30f321d0cac7db49912af41f3e8cf46ad7fcd45600ff69212d9cc7"} Dec 01 09:25:44 crc kubenswrapper[4763]: I1201 09:25:44.689507 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"e765995bfb8b816855e612c6b2fcc077a0fe85dcd3cfb0645b7a6ebae68d7e41"} Dec 01 09:25:44 crc kubenswrapper[4763]: I1201 09:25:44.689517 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" 
event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"99e50aa2e580a7dd597925437550b5f83a896168eec3882631e8057ee3ac81a6"} Dec 01 09:25:44 crc kubenswrapper[4763]: I1201 09:25:44.689526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"b6013fce2eb12dc708b4cff042073caaa015899d8d7f7661d24cd15a57d39e4b"} Dec 01 09:25:45 crc kubenswrapper[4763]: I1201 09:25:45.001488 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57a17bb-0609-4f45-ac9a-af60af65cdd9" path="/var/lib/kubelet/pods/e57a17bb-0609-4f45-ac9a-af60af65cdd9/volumes" Dec 01 09:25:46 crc kubenswrapper[4763]: I1201 09:25:46.704474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"11a8a7a45db5e1695f374d37da42985d262b59f63221c1a71fe82a75d2670996"} Dec 01 09:25:49 crc kubenswrapper[4763]: I1201 09:25:49.728626 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" event={"ID":"d86e127d-5d15-470a-bc56-2ecde45ef4fe","Type":"ContainerStarted","Data":"45f2ad028bcfdbdf151f86af1d14d41e2dd924fb0cfa80c3acc0a6698326ca22"} Dec 01 09:25:49 crc kubenswrapper[4763]: I1201 09:25:49.730198 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:49 crc kubenswrapper[4763]: I1201 09:25:49.730224 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:49 crc kubenswrapper[4763]: I1201 09:25:49.730235 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:49 crc kubenswrapper[4763]: I1201 09:25:49.761612 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:49 crc kubenswrapper[4763]: I1201 09:25:49.762699 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" podStartSLOduration=6.762683388 podStartE2EDuration="6.762683388s" podCreationTimestamp="2025-12-01 09:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:25:49.761105602 +0000 UTC m=+667.029754390" watchObservedRunningTime="2025-12-01 09:25:49.762683388 +0000 UTC m=+667.031332156" Dec 01 09:25:49 crc kubenswrapper[4763]: I1201 09:25:49.771658 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:25:52 crc kubenswrapper[4763]: I1201 09:25:52.997065 4763 scope.go:117] "RemoveContainer" containerID="f7559b28f34a26d39e47f026311b6e84e5ea88b4fa9d864b01d93cf5a16b187e" Dec 01 09:25:52 crc kubenswrapper[4763]: E1201 09:25:52.998755 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-fr552_openshift-multus(192e1ecd-fa1f-4227-a40c-4f7773682880)\"" pod="openshift-multus/multus-fr552" podUID="192e1ecd-fa1f-4227-a40c-4f7773682880" Dec 01 09:26:06 crc kubenswrapper[4763]: I1201 09:26:06.994739 4763 scope.go:117] "RemoveContainer" 
containerID="f7559b28f34a26d39e47f026311b6e84e5ea88b4fa9d864b01d93cf5a16b187e" Dec 01 09:26:07 crc kubenswrapper[4763]: I1201 09:26:07.822710 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fr552_192e1ecd-fa1f-4227-a40c-4f7773682880/kube-multus/2.log" Dec 01 09:26:07 crc kubenswrapper[4763]: I1201 09:26:07.823304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fr552" event={"ID":"192e1ecd-fa1f-4227-a40c-4f7773682880","Type":"ContainerStarted","Data":"246a1fe90a551bccbf713d34c6d10e1f566681a3362d5de5d7a5ce58eb1f150e"} Dec 01 09:26:13 crc kubenswrapper[4763]: I1201 09:26:13.476231 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8tw7k" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.677306 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n"] Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.678899 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.681579 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.688717 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n"] Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.783951 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgb84\" (UniqueName: \"kubernetes.io/projected/427c5d0e-a085-4795-9df8-47584898bc8c-kube-api-access-tgb84\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.784050 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.784078 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.885878 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc 
kubenswrapper[4763]: I1201 09:26:26.885966 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.886034 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgb84\" (UniqueName: \"kubernetes.io/projected/427c5d0e-a085-4795-9df8-47584898bc8c-kube-api-access-tgb84\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.886560 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.886581 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.906235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgb84\" (UniqueName: \"kubernetes.io/projected/427c5d0e-a085-4795-9df8-47584898bc8c-kube-api-access-tgb84\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" Dec 01 09:26:26 crc kubenswrapper[4763]: I1201 09:26:26.992728 4763 util.go:30] "No sandbox for pod can be found. 
Dec 01 09:26:27 crc kubenswrapper[4763]: I1201 09:26:27.406555 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n"]
Dec 01 09:26:27 crc kubenswrapper[4763]: I1201 09:26:27.932865 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" event={"ID":"427c5d0e-a085-4795-9df8-47584898bc8c","Type":"ContainerStarted","Data":"c8a1f441366df68ba0d72ff0e444bdecd67d4c07b3bca98d74eddcf426083ac4"}
Dec 01 09:26:27 crc kubenswrapper[4763]: I1201 09:26:27.933120 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" event={"ID":"427c5d0e-a085-4795-9df8-47584898bc8c","Type":"ContainerStarted","Data":"a6972a2e42b31459db43c3b2651ac9f41a8988b05a56c37af5061562c75ddecd"}
Dec 01 09:26:28 crc kubenswrapper[4763]: I1201 09:26:28.940085 4763 generic.go:334] "Generic (PLEG): container finished" podID="427c5d0e-a085-4795-9df8-47584898bc8c" containerID="c8a1f441366df68ba0d72ff0e444bdecd67d4c07b3bca98d74eddcf426083ac4" exitCode=0
Dec 01 09:26:28 crc kubenswrapper[4763]: I1201 09:26:28.940182 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" event={"ID":"427c5d0e-a085-4795-9df8-47584898bc8c","Type":"ContainerDied","Data":"c8a1f441366df68ba0d72ff0e444bdecd67d4c07b3bca98d74eddcf426083ac4"}
Dec 01 09:26:30 crc kubenswrapper[4763]: I1201 09:26:30.952546 4763 generic.go:334] "Generic (PLEG): container finished" podID="427c5d0e-a085-4795-9df8-47584898bc8c" containerID="e82524d3c74c5b671a6d0b3f26dc0bffe2271a3f9e5f74ce7173174cc80bce43" exitCode=0
Dec 01 09:26:30 crc kubenswrapper[4763]: I1201 09:26:30.952628 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" event={"ID":"427c5d0e-a085-4795-9df8-47584898bc8c","Type":"ContainerDied","Data":"e82524d3c74c5b671a6d0b3f26dc0bffe2271a3f9e5f74ce7173174cc80bce43"}
Dec 01 09:26:31 crc kubenswrapper[4763]: I1201 09:26:31.961287 4763 generic.go:334] "Generic (PLEG): container finished" podID="427c5d0e-a085-4795-9df8-47584898bc8c" containerID="112b6e5fb96f1deec4717248682c66df74d6fda823171f9965c6ff3d6547f692" exitCode=0
Dec 01 09:26:31 crc kubenswrapper[4763]: I1201 09:26:31.961337 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" event={"ID":"427c5d0e-a085-4795-9df8-47584898bc8c","Type":"ContainerDied","Data":"112b6e5fb96f1deec4717248682c66df74d6fda823171f9965c6ff3d6547f692"}
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.193709 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n"
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.363551 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-util\") pod \"427c5d0e-a085-4795-9df8-47584898bc8c\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") "
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.363604 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-bundle\") pod \"427c5d0e-a085-4795-9df8-47584898bc8c\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") "
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.363679 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgb84\" (UniqueName: \"kubernetes.io/projected/427c5d0e-a085-4795-9df8-47584898bc8c-kube-api-access-tgb84\") pod \"427c5d0e-a085-4795-9df8-47584898bc8c\" (UID: \"427c5d0e-a085-4795-9df8-47584898bc8c\") "
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.364237 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-bundle" (OuterVolumeSpecName: "bundle") pod "427c5d0e-a085-4795-9df8-47584898bc8c" (UID: "427c5d0e-a085-4795-9df8-47584898bc8c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.369060 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427c5d0e-a085-4795-9df8-47584898bc8c-kube-api-access-tgb84" (OuterVolumeSpecName: "kube-api-access-tgb84") pod "427c5d0e-a085-4795-9df8-47584898bc8c" (UID: "427c5d0e-a085-4795-9df8-47584898bc8c"). InnerVolumeSpecName "kube-api-access-tgb84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.377867 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-util" (OuterVolumeSpecName: "util") pod "427c5d0e-a085-4795-9df8-47584898bc8c" (UID: "427c5d0e-a085-4795-9df8-47584898bc8c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.466351 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgb84\" (UniqueName: \"kubernetes.io/projected/427c5d0e-a085-4795-9df8-47584898bc8c-kube-api-access-tgb84\") on node \"crc\" DevicePath \"\""
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.466386 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-util\") on node \"crc\" DevicePath \"\""
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.466397 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/427c5d0e-a085-4795-9df8-47584898bc8c-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.976726 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n" event={"ID":"427c5d0e-a085-4795-9df8-47584898bc8c","Type":"ContainerDied","Data":"a6972a2e42b31459db43c3b2651ac9f41a8988b05a56c37af5061562c75ddecd"}
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.977071 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6972a2e42b31459db43c3b2651ac9f41a8988b05a56c37af5061562c75ddecd"
Dec 01 09:26:33 crc kubenswrapper[4763]: I1201 09:26:33.976763 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n"
Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.362602 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f"]
Dec 01 09:26:35 crc kubenswrapper[4763]: E1201 09:26:35.362811 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427c5d0e-a085-4795-9df8-47584898bc8c" containerName="pull"
Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.362824 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="427c5d0e-a085-4795-9df8-47584898bc8c" containerName="pull"
Dec 01 09:26:35 crc kubenswrapper[4763]: E1201 09:26:35.362835 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427c5d0e-a085-4795-9df8-47584898bc8c" containerName="extract"
Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.362842 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="427c5d0e-a085-4795-9df8-47584898bc8c" containerName="extract"
Dec 01 09:26:35 crc kubenswrapper[4763]: E1201 09:26:35.362853 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427c5d0e-a085-4795-9df8-47584898bc8c" containerName="util"
Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.362860 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="427c5d0e-a085-4795-9df8-47584898bc8c" containerName="util"
Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.362960 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="427c5d0e-a085-4795-9df8-47584898bc8c" containerName="extract"
Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.363354 4763 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f" Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.365914 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-v85l8" Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.366016 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.367755 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.376254 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f"] Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.410333 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgh8l\" (UniqueName: \"kubernetes.io/projected/8b28a543-e1fd-4862-af9e-b4c77a652700-kube-api-access-fgh8l\") pod \"nmstate-operator-5b5b58f5c8-5vb8f\" (UID: \"8b28a543-e1fd-4862-af9e-b4c77a652700\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f" Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.511332 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgh8l\" (UniqueName: \"kubernetes.io/projected/8b28a543-e1fd-4862-af9e-b4c77a652700-kube-api-access-fgh8l\") pod \"nmstate-operator-5b5b58f5c8-5vb8f\" (UID: \"8b28a543-e1fd-4862-af9e-b4c77a652700\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f" Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.532037 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgh8l\" (UniqueName: \"kubernetes.io/projected/8b28a543-e1fd-4862-af9e-b4c77a652700-kube-api-access-fgh8l\") pod \"nmstate-operator-5b5b58f5c8-5vb8f\" (UID: \"8b28a543-e1fd-4862-af9e-b4c77a652700\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f" Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.677437 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f" Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.857622 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f"] Dec 01 09:26:35 crc kubenswrapper[4763]: W1201 09:26:35.859592 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b28a543_e1fd_4862_af9e_b4c77a652700.slice/crio-9998e38a78a8bcbac673376fab321720b093dedfb39e8586b18a23f1d0dfe96c WatchSource:0}: Error finding container 9998e38a78a8bcbac673376fab321720b093dedfb39e8586b18a23f1d0dfe96c: Status 404 returned error can't find the container with id 9998e38a78a8bcbac673376fab321720b093dedfb39e8586b18a23f1d0dfe96c Dec 01 09:26:35 crc kubenswrapper[4763]: I1201 09:26:35.987380 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f" event={"ID":"8b28a543-e1fd-4862-af9e-b4c77a652700","Type":"ContainerStarted","Data":"9998e38a78a8bcbac673376fab321720b093dedfb39e8586b18a23f1d0dfe96c"} Dec 01 09:26:41 crc kubenswrapper[4763]: I1201 09:26:41.010880 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f" event={"ID":"8b28a543-e1fd-4862-af9e-b4c77a652700","Type":"ContainerStarted","Data":"57d6cb7f59c77fd3a51d90f3c2aaa30da3bc716ed4ef8762623ff566d83eb71b"} Dec 01 09:26:41 crc kubenswrapper[4763]: I1201 09:26:41.028027 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-5vb8f" podStartSLOduration=1.9678855629999998 podStartE2EDuration="6.02800352s" podCreationTimestamp="2025-12-01 09:26:35 +0000 UTC" firstStartedPulling="2025-12-01 09:26:35.86359749 +0000 UTC m=+713.132246258" lastFinishedPulling="2025-12-01 09:26:39.923715447 +0000 UTC m=+717.192364215" observedRunningTime="2025-12-01 09:26:41.024803217 +0000 UTC m=+718.293451985" watchObservedRunningTime="2025-12-01 09:26:41.02800352 +0000 UTC m=+718.296652288" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.002264 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.003600 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.005293 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vtcj5" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.013835 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.035824 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.036721 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:42 crc kubenswrapper[4763]: W1201 09:26:42.042793 4763 reflector.go:561] object-"openshift-nmstate"/"openshift-nmstate-webhook": failed to list *v1.Secret: secrets "openshift-nmstate-webhook" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Dec 01 09:26:42 crc kubenswrapper[4763]: E1201 09:26:42.042835 4763 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"openshift-nmstate-webhook\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-nmstate-webhook\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.072003 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-h8q8q"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.072757 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.087655 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.088980 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h76hg\" (UniqueName: \"kubernetes.io/projected/6789e53e-89e7-4593-a298-4b9eb0e0cf24-kube-api-access-h76hg\") pod \"nmstate-metrics-7f946cbc9-v2v9v\" (UID: \"6789e53e-89e7-4593-a298-4b9eb0e0cf24\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.089011 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kpwt\" (UniqueName: \"kubernetes.io/projected/31ef1fc8-adab-4f75-bac1-e8ff859eb425-kube-api-access-4kpwt\") pod \"nmstate-webhook-5f6d4c5ccb-z2gjv\" (UID: \"31ef1fc8-adab-4f75-bac1-e8ff859eb425\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.089045 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/31ef1fc8-adab-4f75-bac1-e8ff859eb425-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-z2gjv\" (UID: \"31ef1fc8-adab-4f75-bac1-e8ff859eb425\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.165492 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.166379 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.168828 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mkmtt" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.169005 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.169111 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.184255 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.200163 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsxl8\" (UniqueName: \"kubernetes.io/projected/1e79baf9-ce6c-4a92-891f-54eba3049168-kube-api-access-xsxl8\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.200249 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-ovs-socket\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.200323 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-dbus-socket\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.200489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h76hg\" (UniqueName: \"kubernetes.io/projected/6789e53e-89e7-4593-a298-4b9eb0e0cf24-kube-api-access-h76hg\") pod \"nmstate-metrics-7f946cbc9-v2v9v\" (UID: \"6789e53e-89e7-4593-a298-4b9eb0e0cf24\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.200541 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kpwt\" (UniqueName: \"kubernetes.io/projected/31ef1fc8-adab-4f75-bac1-e8ff859eb425-kube-api-access-4kpwt\") pod \"nmstate-webhook-5f6d4c5ccb-z2gjv\" (UID: \"31ef1fc8-adab-4f75-bac1-e8ff859eb425\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.200571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-nmstate-lock\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.200601 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/31ef1fc8-adab-4f75-bac1-e8ff859eb425-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-z2gjv\" (UID: 
\"31ef1fc8-adab-4f75-bac1-e8ff859eb425\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.221690 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kpwt\" (UniqueName: \"kubernetes.io/projected/31ef1fc8-adab-4f75-bac1-e8ff859eb425-kube-api-access-4kpwt\") pod \"nmstate-webhook-5f6d4c5ccb-z2gjv\" (UID: \"31ef1fc8-adab-4f75-bac1-e8ff859eb425\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.237144 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h76hg\" (UniqueName: \"kubernetes.io/projected/6789e53e-89e7-4593-a298-4b9eb0e0cf24-kube-api-access-h76hg\") pod \"nmstate-metrics-7f946cbc9-v2v9v\" (UID: \"6789e53e-89e7-4593-a298-4b9eb0e0cf24\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.301872 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.301919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-nmstate-lock\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.301959 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsxl8\" (UniqueName: \"kubernetes.io/projected/1e79baf9-ce6c-4a92-891f-54eba3049168-kube-api-access-xsxl8\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.301981 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-ovs-socket\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.302016 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-ovs-socket\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.302057 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.302092 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkqzd\" (UniqueName: 
\"kubernetes.io/projected/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-kube-api-access-wkqzd\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.302107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-nmstate-lock\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.302127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-dbus-socket\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.302293 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1e79baf9-ce6c-4a92-891f-54eba3049168-dbus-socket\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.327662 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.328069 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsxl8\" (UniqueName: \"kubernetes.io/projected/1e79baf9-ce6c-4a92-891f-54eba3049168-kube-api-access-xsxl8\") pod \"nmstate-handler-h8q8q\" (UID: \"1e79baf9-ce6c-4a92-891f-54eba3049168\") " pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.397091 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-745856c86d-x4rhc"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.397757 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.401591 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.402912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.403003 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkqzd\" (UniqueName: \"kubernetes.io/projected/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-kube-api-access-wkqzd\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.403060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: E1201 09:26:42.403082 4763 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 01 09:26:42 crc kubenswrapper[4763]: E1201 09:26:42.403144 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-plugin-serving-cert podName:757fd525-a1b2-45c0-a3eb-7b8c3d6423d3 nodeName:}" failed. No retries permitted until 2025-12-01 09:26:42.903121341 +0000 UTC m=+720.171770109 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-rsnnh" (UID: "757fd525-a1b2-45c0-a3eb-7b8c3d6423d3") : secret "plugin-serving-cert" not found Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.403932 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.418939 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-745856c86d-x4rhc"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.436626 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkqzd\" (UniqueName: \"kubernetes.io/projected/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-kube-api-access-wkqzd\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.515311 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sfhg\" (UniqueName: \"kubernetes.io/projected/97920ef4-5944-4153-bc01-e7a8a86d35c8-kube-api-access-6sfhg\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.515358 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-oauth-config\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.515379 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-config\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.515417 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-oauth-serving-cert\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.515442 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-serving-cert\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.515486 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-trusted-ca-bundle\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.515541 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-service-ca\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.617047 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-config\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.617127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-oauth-serving-cert\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.617154 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-serving-cert\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.617194 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-trusted-ca-bundle\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.617269 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-service-ca\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.617296 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sfhg\" (UniqueName: \"kubernetes.io/projected/97920ef4-5944-4153-bc01-e7a8a86d35c8-kube-api-access-6sfhg\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.617314 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-oauth-config\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.618312 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-config\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.618773 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-oauth-serving-cert\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.619312 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-trusted-ca-bundle\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.619566 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97920ef4-5944-4153-bc01-e7a8a86d35c8-service-ca\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.623985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-oauth-config\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.624039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97920ef4-5944-4153-bc01-e7a8a86d35c8-console-serving-cert\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.641333 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.643440 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sfhg\" (UniqueName: \"kubernetes.io/projected/97920ef4-5944-4153-bc01-e7a8a86d35c8-kube-api-access-6sfhg\") pod \"console-745856c86d-x4rhc\" (UID: \"97920ef4-5944-4153-bc01-e7a8a86d35c8\") " pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: W1201 09:26:42.652824 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6789e53e_89e7_4593_a298_4b9eb0e0cf24.slice/crio-f9e3765c2dd5c13fdb2f9c9903f029a9fc895380628e11699abedde8fd8521d5 WatchSource:0}: Error finding container f9e3765c2dd5c13fdb2f9c9903f029a9fc895380628e11699abedde8fd8521d5: Status 404 returned error can't find the container with id f9e3765c2dd5c13fdb2f9c9903f029a9fc895380628e11699abedde8fd8521d5 Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.717165 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.879301 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-745856c86d-x4rhc"] Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.921578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:42 crc kubenswrapper[4763]: I1201 09:26:42.925831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/757fd525-a1b2-45c0-a3eb-7b8c3d6423d3-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-rsnnh\" (UID: \"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.025175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-745856c86d-x4rhc" event={"ID":"97920ef4-5944-4153-bc01-e7a8a86d35c8","Type":"ContainerStarted","Data":"20efc6c83a7b7da9378f98830481e88cbff415ffa16223475dbcd0bddddf1a75"} Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.026304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h8q8q" event={"ID":"1e79baf9-ce6c-4a92-891f-54eba3049168","Type":"ContainerStarted","Data":"fc70190291cb4ba23824936af2b3de52ed75a05708c894427f8ebef3e02b82e8"} Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.027932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" event={"ID":"6789e53e-89e7-4593-a298-4b9eb0e0cf24","Type":"ContainerStarted","Data":"f9e3765c2dd5c13fdb2f9c9903f029a9fc895380628e11699abedde8fd8521d5"} Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.049240 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.055167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/31ef1fc8-adab-4f75-bac1-e8ff859eb425-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-z2gjv\" (UID: \"31ef1fc8-adab-4f75-bac1-e8ff859eb425\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.082575 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mkmtt" Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.090612 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.262608 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.397413 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh"] Dec 01 09:26:43 crc kubenswrapper[4763]: W1201 09:26:43.408736 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757fd525_a1b2_45c0_a3eb_7b8c3d6423d3.slice/crio-5c43b7ccb738106294ce02289c1c0354f5754ff653d60d52deafceb0df502e1c WatchSource:0}: Error finding container 5c43b7ccb738106294ce02289c1c0354f5754ff653d60d52deafceb0df502e1c: Status 404 returned error can't find the container with id 5c43b7ccb738106294ce02289c1c0354f5754ff653d60d52deafceb0df502e1c Dec 01 09:26:43 crc kubenswrapper[4763]: I1201 09:26:43.543160 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv"] Dec 01 09:26:43 crc kubenswrapper[4763]: W1201 09:26:43.550591 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31ef1fc8_adab_4f75_bac1_e8ff859eb425.slice/crio-9c14e3c9f9a1b928da10073dc40682c673eedfe5965af50fe4eb376e4160044e WatchSource:0}: Error finding container 9c14e3c9f9a1b928da10073dc40682c673eedfe5965af50fe4eb376e4160044e: Status 404 returned error can't find the container with id 9c14e3c9f9a1b928da10073dc40682c673eedfe5965af50fe4eb376e4160044e Dec 01 09:26:44 crc kubenswrapper[4763]: I1201 09:26:44.033495 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" event={"ID":"31ef1fc8-adab-4f75-bac1-e8ff859eb425","Type":"ContainerStarted","Data":"9c14e3c9f9a1b928da10073dc40682c673eedfe5965af50fe4eb376e4160044e"} Dec 01 09:26:44 crc kubenswrapper[4763]: I1201 09:26:44.034908 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-745856c86d-x4rhc" event={"ID":"97920ef4-5944-4153-bc01-e7a8a86d35c8","Type":"ContainerStarted","Data":"3ccc587b52aa06d4d9664472ae2b78aa3bb84fe378ec6ed0c048b7cc7a3b42e4"} Dec 01 09:26:44 crc kubenswrapper[4763]: I1201 09:26:44.035669 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" event={"ID":"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3","Type":"ContainerStarted","Data":"5c43b7ccb738106294ce02289c1c0354f5754ff653d60d52deafceb0df502e1c"} Dec 01 09:26:44 crc kubenswrapper[4763]: I1201 09:26:44.054276 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-745856c86d-x4rhc" podStartSLOduration=2.054255406 podStartE2EDuration="2.054255406s" podCreationTimestamp="2025-12-01 09:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:26:44.050758874 +0000 UTC m=+721.319407672" watchObservedRunningTime="2025-12-01 09:26:44.054255406 +0000 UTC m=+721.322904174" Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.051086 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" event={"ID":"757fd525-a1b2-45c0-a3eb-7b8c3d6423d3","Type":"ContainerStarted","Data":"bbbba206a6f75bf32035f6aec3c8a4de0fe5ce34df80326e80589ac1fd0db6da"} Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.054316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-handler-h8q8q" event={"ID":"1e79baf9-ce6c-4a92-891f-54eba3049168","Type":"ContainerStarted","Data":"899f3cf0ef3bc758348bcfbd633b76e7ef4e485eb338b14bc5df50a77662c262"} Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.054769 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.056335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" event={"ID":"6789e53e-89e7-4593-a298-4b9eb0e0cf24","Type":"ContainerStarted","Data":"eb95819ba9e8520f13a5ce188d1eb5c3291c43a79034ac00ee75787484703f98"} Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.057519 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" event={"ID":"31ef1fc8-adab-4f75-bac1-e8ff859eb425","Type":"ContainerStarted","Data":"707a0fbedb869e0c261e3c9266477471ecd2c4c7560eb323adab341d42f1c232"} Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.058211 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.073665 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rsnnh" podStartSLOduration=2.06952227 podStartE2EDuration="5.073645032s" podCreationTimestamp="2025-12-01 09:26:42 +0000 UTC" firstStartedPulling="2025-12-01 09:26:43.414635438 +0000 UTC m=+720.683284216" lastFinishedPulling="2025-12-01 09:26:46.41875821 +0000 UTC m=+723.687406978" observedRunningTime="2025-12-01 09:26:47.07278551 +0000 UTC m=+724.341434278" watchObservedRunningTime="2025-12-01 09:26:47.073645032 +0000 UTC m=+724.342293800" Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.098426 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-h8q8q" podStartSLOduration=1.11061869 podStartE2EDuration="5.098400465s" podCreationTimestamp="2025-12-01 09:26:42 +0000 UTC" firstStartedPulling="2025-12-01 09:26:42.437976286 +0000 UTC m=+719.706625054" lastFinishedPulling="2025-12-01 09:26:46.425758051 +0000 UTC m=+723.694406829" observedRunningTime="2025-12-01 09:26:47.091084985 +0000 UTC m=+724.359733753" watchObservedRunningTime="2025-12-01 09:26:47.098400465 +0000 UTC m=+724.367049233" Dec 01 09:26:47 crc kubenswrapper[4763]: I1201 09:26:47.124099 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" podStartSLOduration=2.227805328 podStartE2EDuration="5.124077171s" podCreationTimestamp="2025-12-01 09:26:42 +0000 UTC" firstStartedPulling="2025-12-01 09:26:43.551960631 +0000 UTC m=+720.820609399" lastFinishedPulling="2025-12-01 09:26:46.448232474 +0000 UTC m=+723.716881242" observedRunningTime="2025-12-01 09:26:47.123384173 +0000 UTC m=+724.392032941" watchObservedRunningTime="2025-12-01 09:26:47.124077171 +0000 UTC m=+724.392725939" Dec 01 09:26:49 crc kubenswrapper[4763]: I1201 09:26:49.071941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" event={"ID":"6789e53e-89e7-4593-a298-4b9eb0e0cf24","Type":"ContainerStarted","Data":"0e54e0e9e01361082f016599eba17b528c109a09376e63a208dc875c8abd9aa4"} Dec 01 09:26:49 crc kubenswrapper[4763]: I1201 09:26:49.103125 4763 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-v2v9v" podStartSLOduration=1.836952959 podStartE2EDuration="8.103106483s" podCreationTimestamp="2025-12-01 09:26:41 +0000 UTC" firstStartedPulling="2025-12-01 09:26:42.654593357 +0000 UTC m=+719.923242125" lastFinishedPulling="2025-12-01 09:26:48.920746871 +0000 UTC m=+726.189395649" observedRunningTime="2025-12-01 09:26:49.095181987 +0000 UTC m=+726.363830755" watchObservedRunningTime="2025-12-01 09:26:49.103106483 +0000 UTC m=+726.371755251" Dec 01 09:26:52 crc kubenswrapper[4763]: I1201 09:26:52.425957 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-h8q8q" Dec 01 09:26:52 crc kubenswrapper[4763]: I1201 09:26:52.717840 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:52 crc kubenswrapper[4763]: I1201 09:26:52.717891 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:52 crc kubenswrapper[4763]: I1201 09:26:52.723028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:53 crc kubenswrapper[4763]: I1201 09:26:53.094988 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-745856c86d-x4rhc" Dec 01 09:26:53 crc kubenswrapper[4763]: I1201 09:26:53.156979 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-458q5"] Dec 01 09:27:03 crc kubenswrapper[4763]: I1201 09:27:03.269733 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-z2gjv" Dec 01 09:27:14 crc kubenswrapper[4763]: I1201 09:27:14.245252 4763 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.377206 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv"] Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.379742 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.381630 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.386773 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv"] Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.443362 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8lv\" (UniqueName: \"kubernetes.io/projected/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-kube-api-access-dq8lv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.443455 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.443504 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.544664 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8lv\" (UniqueName: \"kubernetes.io/projected/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-kube-api-access-dq8lv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.544954 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.545071 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.545508 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.545530 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.579651 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8lv\" (UniqueName: \"kubernetes.io/projected/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-kube-api-access-dq8lv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:17 crc kubenswrapper[4763]: I1201 09:27:17.696120 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.013880 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv"] Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.195975 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-458q5" podUID="f0ccd14f-5d77-4541-860f-d834079cf97f" containerName="console" containerID="cri-o://2eab0facfcd9cd78e1730283d2893dfdb96618082a7fd0468ec6aebf59b5b5ec" gracePeriod=15 Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.434355 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerID="6f238abad0d5fcf438d75ed496ab68d60442cd2c9499b9cd87f783c7b2aa65a6" exitCode=0 Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.434514 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" event={"ID":"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94","Type":"ContainerDied","Data":"6f238abad0d5fcf438d75ed496ab68d60442cd2c9499b9cd87f783c7b2aa65a6"} Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.434749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" event={"ID":"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94","Type":"ContainerStarted","Data":"e021b81ae5bd0b388ec0ab101d401bd13f126bab314259692f2dbceba9af53f2"} Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.446258 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-458q5_f0ccd14f-5d77-4541-860f-d834079cf97f/console/0.log" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.446294 4763 generic.go:334] "Generic (PLEG): container finished" podID="f0ccd14f-5d77-4541-860f-d834079cf97f" containerID="2eab0facfcd9cd78e1730283d2893dfdb96618082a7fd0468ec6aebf59b5b5ec" exitCode=2 Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.446322 4763 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-458q5" event={"ID":"f0ccd14f-5d77-4541-860f-d834079cf97f","Type":"ContainerDied","Data":"2eab0facfcd9cd78e1730283d2893dfdb96618082a7fd0468ec6aebf59b5b5ec"} Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.511682 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-458q5_f0ccd14f-5d77-4541-860f-d834079cf97f/console/0.log" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.511749 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.565223 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-trusted-ca-bundle\") pod \"f0ccd14f-5d77-4541-860f-d834079cf97f\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.565265 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-console-config\") pod \"f0ccd14f-5d77-4541-860f-d834079cf97f\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.565289 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-service-ca\") pod \"f0ccd14f-5d77-4541-860f-d834079cf97f\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.565305 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-oauth-config\") pod \"f0ccd14f-5d77-4541-860f-d834079cf97f\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.565323 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-serving-cert\") pod \"f0ccd14f-5d77-4541-860f-d834079cf97f\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.565346 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cknl4\" (UniqueName: \"kubernetes.io/projected/f0ccd14f-5d77-4541-860f-d834079cf97f-kube-api-access-cknl4\") pod \"f0ccd14f-5d77-4541-860f-d834079cf97f\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.565371 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-oauth-serving-cert\") pod \"f0ccd14f-5d77-4541-860f-d834079cf97f\" (UID: \"f0ccd14f-5d77-4541-860f-d834079cf97f\") " Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.566301 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f0ccd14f-5d77-4541-860f-d834079cf97f" (UID: "f0ccd14f-5d77-4541-860f-d834079cf97f"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.566874 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f0ccd14f-5d77-4541-860f-d834079cf97f" (UID: "f0ccd14f-5d77-4541-860f-d834079cf97f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.566978 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-console-config" (OuterVolumeSpecName: "console-config") pod "f0ccd14f-5d77-4541-860f-d834079cf97f" (UID: "f0ccd14f-5d77-4541-860f-d834079cf97f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.567272 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-service-ca" (OuterVolumeSpecName: "service-ca") pod "f0ccd14f-5d77-4541-860f-d834079cf97f" (UID: "f0ccd14f-5d77-4541-860f-d834079cf97f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.574655 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ccd14f-5d77-4541-860f-d834079cf97f-kube-api-access-cknl4" (OuterVolumeSpecName: "kube-api-access-cknl4") pod "f0ccd14f-5d77-4541-860f-d834079cf97f" (UID: "f0ccd14f-5d77-4541-860f-d834079cf97f"). InnerVolumeSpecName "kube-api-access-cknl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.575153 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f0ccd14f-5d77-4541-860f-d834079cf97f" (UID: "f0ccd14f-5d77-4541-860f-d834079cf97f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.571960 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f0ccd14f-5d77-4541-860f-d834079cf97f" (UID: "f0ccd14f-5d77-4541-860f-d834079cf97f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.666489 4763 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.666532 4763 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-service-ca\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.666541 4763 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0ccd14f-5d77-4541-860f-d834079cf97f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.666550 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cknl4\" (UniqueName: \"kubernetes.io/projected/f0ccd14f-5d77-4541-860f-d834079cf97f-kube-api-access-cknl4\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.666558 4763 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.666566 4763 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:18 crc kubenswrapper[4763]: I1201 09:27:18.666576 4763 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0ccd14f-5d77-4541-860f-d834079cf97f-console-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.457624 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-458q5_f0ccd14f-5d77-4541-860f-d834079cf97f/console/0.log" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.458060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-458q5" event={"ID":"f0ccd14f-5d77-4541-860f-d834079cf97f","Type":"ContainerDied","Data":"061f789e392c10e7cac7f92148a23e57d121bb2bffa97a7cddf22188524862f3"} Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.458106 4763 scope.go:117] "RemoveContainer" containerID="2eab0facfcd9cd78e1730283d2893dfdb96618082a7fd0468ec6aebf59b5b5ec" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.458195 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-458q5" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.486766 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-458q5"] Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.491311 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-458q5"] Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.756146 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xrbq2"] Dec 01 09:27:19 crc kubenswrapper[4763]: E1201 09:27:19.756490 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ccd14f-5d77-4541-860f-d834079cf97f" containerName="console" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.756506 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ccd14f-5d77-4541-860f-d834079cf97f" containerName="console" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.756668 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ccd14f-5d77-4541-860f-d834079cf97f" containerName="console" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.759312 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.781040 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrbq2"] Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.783602 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-catalog-content\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.783661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7vpc\" (UniqueName: \"kubernetes.io/projected/be49bf95-141c-4034-ac19-ad1f4b9dbe30-kube-api-access-d7vpc\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.783737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-utilities\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.885101 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-utilities\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.885238 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-catalog-content\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: 
I1201 09:27:19.885261 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7vpc\" (UniqueName: \"kubernetes.io/projected/be49bf95-141c-4034-ac19-ad1f4b9dbe30-kube-api-access-d7vpc\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.886061 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-utilities\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.886115 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-catalog-content\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:19 crc kubenswrapper[4763]: I1201 09:27:19.904858 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7vpc\" (UniqueName: \"kubernetes.io/projected/be49bf95-141c-4034-ac19-ad1f4b9dbe30-kube-api-access-d7vpc\") pod \"redhat-operators-xrbq2\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:20 crc kubenswrapper[4763]: I1201 09:27:20.079074 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:20 crc kubenswrapper[4763]: I1201 09:27:20.297728 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrbq2"] Dec 01 09:27:20 crc kubenswrapper[4763]: I1201 09:27:20.465434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrbq2" event={"ID":"be49bf95-141c-4034-ac19-ad1f4b9dbe30","Type":"ContainerStarted","Data":"8a7149f634211acca99d059cd4a882b5a863ec8afc7991818db284e02d2642a9"} Dec 01 09:27:21 crc kubenswrapper[4763]: I1201 09:27:21.000736 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ccd14f-5d77-4541-860f-d834079cf97f" path="/var/lib/kubelet/pods/f0ccd14f-5d77-4541-860f-d834079cf97f/volumes" Dec 01 09:27:21 crc kubenswrapper[4763]: I1201 09:27:21.470959 4763 generic.go:334] "Generic (PLEG): container finished" podID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerID="c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37" exitCode=0 Dec 01 09:27:21 crc kubenswrapper[4763]: I1201 09:27:21.471004 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrbq2" event={"ID":"be49bf95-141c-4034-ac19-ad1f4b9dbe30","Type":"ContainerDied","Data":"c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37"} Dec 01 09:27:21 crc kubenswrapper[4763]: I1201 09:27:21.472658 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerID="4bb34e20518c61d13a063bfaae27de831a3e2f056957e357edadc1dde62a5f9e" exitCode=0 Dec 01 09:27:21 crc kubenswrapper[4763]: I1201 09:27:21.472688 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" 
event={"ID":"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94","Type":"ContainerDied","Data":"4bb34e20518c61d13a063bfaae27de831a3e2f056957e357edadc1dde62a5f9e"} Dec 01 09:27:22 crc kubenswrapper[4763]: I1201 09:27:22.482887 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrbq2" event={"ID":"be49bf95-141c-4034-ac19-ad1f4b9dbe30","Type":"ContainerStarted","Data":"99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7"} Dec 01 09:27:22 crc kubenswrapper[4763]: I1201 09:27:22.494630 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerID="c64f747bb106f119a16dd4f794c5210135662feef120c80ce6080e3c8e60fe7e" exitCode=0 Dec 01 09:27:22 crc kubenswrapper[4763]: I1201 09:27:22.495097 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" event={"ID":"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94","Type":"ContainerDied","Data":"c64f747bb106f119a16dd4f794c5210135662feef120c80ce6080e3c8e60fe7e"} Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.077127 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.266228 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-bundle\") pod \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.266316 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq8lv\" (UniqueName: \"kubernetes.io/projected/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-kube-api-access-dq8lv\") pod \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.266391 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-util\") pod \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\" (UID: \"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94\") " Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.267400 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-bundle" (OuterVolumeSpecName: "bundle") pod "fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" (UID: "fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.271994 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-kube-api-access-dq8lv" (OuterVolumeSpecName: "kube-api-access-dq8lv") pod "fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" (UID: "fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94"). InnerVolumeSpecName "kube-api-access-dq8lv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.279747 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-util" (OuterVolumeSpecName: "util") pod "fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" (UID: "fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.367606 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-util\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.367645 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.367657 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq8lv\" (UniqueName: \"kubernetes.io/projected/fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94-kube-api-access-dq8lv\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.676038 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" event={"ID":"fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94","Type":"ContainerDied","Data":"e021b81ae5bd0b388ec0ab101d401bd13f126bab314259692f2dbceba9af53f2"} Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.676049 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.676076 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e021b81ae5bd0b388ec0ab101d401bd13f126bab314259692f2dbceba9af53f2" Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.679519 4763 generic.go:334] "Generic (PLEG): container finished" podID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerID="99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7" exitCode=0 Dec 01 09:27:24 crc kubenswrapper[4763]: I1201 09:27:24.679630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrbq2" event={"ID":"be49bf95-141c-4034-ac19-ad1f4b9dbe30","Type":"ContainerDied","Data":"99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7"} Dec 01 09:27:26 crc kubenswrapper[4763]: I1201 09:27:26.696859 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrbq2" event={"ID":"be49bf95-141c-4034-ac19-ad1f4b9dbe30","Type":"ContainerStarted","Data":"6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad"} Dec 01 09:27:26 crc kubenswrapper[4763]: I1201 09:27:26.717219 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xrbq2" podStartSLOduration=3.999968902 podStartE2EDuration="7.717201396s" podCreationTimestamp="2025-12-01 09:27:19 +0000 UTC" firstStartedPulling="2025-12-01 09:27:21.473150135 +0000 UTC m=+758.741798903" lastFinishedPulling="2025-12-01 09:27:25.190382619 +0000 UTC m=+762.459031397" observedRunningTime="2025-12-01 09:27:26.712654288 +0000 UTC m=+763.981303056" watchObservedRunningTime="2025-12-01 
09:27:26.717201396 +0000 UTC m=+763.985850164" Dec 01 09:27:30 crc kubenswrapper[4763]: I1201 09:27:30.079342 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:30 crc kubenswrapper[4763]: I1201 09:27:30.079924 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:31 crc kubenswrapper[4763]: I1201 09:27:31.178072 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrbq2" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="registry-server" probeResult="failure" output=< Dec 01 09:27:31 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 09:27:31 crc kubenswrapper[4763]: > Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.487762 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5"] Dec 01 09:27:33 crc kubenswrapper[4763]: E1201 09:27:33.489133 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerName="pull" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.489215 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerName="pull" Dec 01 09:27:33 crc kubenswrapper[4763]: E1201 09:27:33.489278 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerName="extract" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.489336 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerName="extract" Dec 01 09:27:33 crc kubenswrapper[4763]: E1201 09:27:33.489411 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerName="util" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.489498 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerName="util" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.489681 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94" containerName="extract" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.490179 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.497435 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.497609 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.497612 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.497844 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.502634 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rzkz2" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.522910 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5"] Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.583711 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-apiservice-cert\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.584034 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzqkq\" (UniqueName: \"kubernetes.io/projected/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-kube-api-access-qzqkq\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.584293 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-webhook-cert\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.685937 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzqkq\" (UniqueName: \"kubernetes.io/projected/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-kube-api-access-qzqkq\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.686039 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-webhook-cert\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.686067 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-apiservice-cert\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.692029 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-apiservice-cert\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.716209 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-webhook-cert\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.720036 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725"] Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.726768 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.742130 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.743787 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.744173 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzqkq\" (UniqueName: \"kubernetes.io/projected/e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5-kube-api-access-qzqkq\") pod \"metallb-operator-controller-manager-7488746df5-gj8c5\" (UID: \"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5\") " pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.744339 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-gjf6j" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.788419 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-webhook-cert\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.788557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-apiservice-cert\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: 
I1201 09:27:33.788621 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzjk8\" (UniqueName: \"kubernetes.io/projected/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-kube-api-access-jzjk8\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.805358 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.806404 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725"] Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.889998 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzjk8\" (UniqueName: \"kubernetes.io/projected/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-kube-api-access-jzjk8\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.890059 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-webhook-cert\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.890095 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-apiservice-cert\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.894054 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-webhook-cert\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.894256 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-apiservice-cert\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.937600 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.937655 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" 
podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:27:33 crc kubenswrapper[4763]: I1201 09:27:33.938327 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzjk8\" (UniqueName: \"kubernetes.io/projected/eeaeb040-1a35-4174-9c9c-7ffe226a79e5-kube-api-access-jzjk8\") pod \"metallb-operator-webhook-server-7c7c865bc4-5b725\" (UID: \"eeaeb040-1a35-4174-9c9c-7ffe226a79e5\") " pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:34 crc kubenswrapper[4763]: I1201 09:27:34.137518 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:34 crc kubenswrapper[4763]: I1201 09:27:34.307816 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5"] Dec 01 09:27:34 crc kubenswrapper[4763]: I1201 09:27:34.663939 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725"] Dec 01 09:27:34 crc kubenswrapper[4763]: W1201 09:27:34.670023 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeaeb040_1a35_4174_9c9c_7ffe226a79e5.slice/crio-7f94dc29c27f69780c463a6a8c15050e073501958dc5b86c05a7391b0f8cdfdd WatchSource:0}: Error finding container 7f94dc29c27f69780c463a6a8c15050e073501958dc5b86c05a7391b0f8cdfdd: Status 404 returned error can't find the container with id 7f94dc29c27f69780c463a6a8c15050e073501958dc5b86c05a7391b0f8cdfdd Dec 01 09:27:34 crc kubenswrapper[4763]: I1201 09:27:34.739981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" event={"ID":"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5","Type":"ContainerStarted","Data":"0dcb330c95602b849cc9c8f4682f112e0640ff6d5a8f4464a349fffce9da3810"} Dec 01 09:27:34 crc kubenswrapper[4763]: I1201 09:27:34.741275 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" event={"ID":"eeaeb040-1a35-4174-9c9c-7ffe226a79e5","Type":"ContainerStarted","Data":"7f94dc29c27f69780c463a6a8c15050e073501958dc5b86c05a7391b0f8cdfdd"} Dec 01 09:27:40 crc kubenswrapper[4763]: I1201 09:27:40.149166 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:40 crc kubenswrapper[4763]: I1201 09:27:40.210027 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:41 crc kubenswrapper[4763]: I1201 09:27:41.937902 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrbq2"] Dec 01 09:27:41 crc kubenswrapper[4763]: I1201 09:27:41.938399 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xrbq2" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="registry-server" containerID="cri-o://6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad" gracePeriod=2 Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.372345 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.379929 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-catalog-content\") pod \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.380068 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-utilities\") pod \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.380913 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-utilities" (OuterVolumeSpecName: "utilities") pod "be49bf95-141c-4034-ac19-ad1f4b9dbe30" (UID: "be49bf95-141c-4034-ac19-ad1f4b9dbe30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.481401 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7vpc\" (UniqueName: \"kubernetes.io/projected/be49bf95-141c-4034-ac19-ad1f4b9dbe30-kube-api-access-d7vpc\") pod \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\" (UID: \"be49bf95-141c-4034-ac19-ad1f4b9dbe30\") " Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.498868 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.506678 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be49bf95-141c-4034-ac19-ad1f4b9dbe30-kube-api-access-d7vpc" (OuterVolumeSpecName: "kube-api-access-d7vpc") pod "be49bf95-141c-4034-ac19-ad1f4b9dbe30" (UID: "be49bf95-141c-4034-ac19-ad1f4b9dbe30"). InnerVolumeSpecName "kube-api-access-d7vpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.508500 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be49bf95-141c-4034-ac19-ad1f4b9dbe30" (UID: "be49bf95-141c-4034-ac19-ad1f4b9dbe30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.600892 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be49bf95-141c-4034-ac19-ad1f4b9dbe30-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.601258 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7vpc\" (UniqueName: \"kubernetes.io/projected/be49bf95-141c-4034-ac19-ad1f4b9dbe30-kube-api-access-d7vpc\") on node \"crc\" DevicePath \"\"" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.793194 4763 generic.go:334] "Generic (PLEG): container finished" podID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerID="6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad" exitCode=0 Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.793261 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrbq2" event={"ID":"be49bf95-141c-4034-ac19-ad1f4b9dbe30","Type":"ContainerDied","Data":"6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad"} Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.793287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrbq2" event={"ID":"be49bf95-141c-4034-ac19-ad1f4b9dbe30","Type":"ContainerDied","Data":"8a7149f634211acca99d059cd4a882b5a863ec8afc7991818db284e02d2642a9"} Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.793304 4763 scope.go:117] "RemoveContainer" containerID="6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.793308 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrbq2" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.795782 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" event={"ID":"e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5","Type":"ContainerStarted","Data":"fd858f7f75aa7d131f703466b2231436cbd80120faf14e2d7a0e031ed06b26bb"} Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.797060 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.823096 4763 scope.go:117] "RemoveContainer" containerID="99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.824486 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" event={"ID":"eeaeb040-1a35-4174-9c9c-7ffe226a79e5","Type":"ContainerStarted","Data":"cc68ab9152ca24fddbf4dc91162b856931430b4b58ab2c079bcf078e8201169d"} Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.825083 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.833799 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" podStartSLOduration=2.055255461 podStartE2EDuration="9.833775628s" podCreationTimestamp="2025-12-01 09:27:33 +0000 UTC" firstStartedPulling="2025-12-01 09:27:34.354640572 +0000 UTC m=+771.623289340" lastFinishedPulling="2025-12-01 09:27:42.133160739 +0000 UTC m=+779.401809507" observedRunningTime="2025-12-01 09:27:42.82729974 +0000 UTC m=+780.095948508" watchObservedRunningTime="2025-12-01 09:27:42.833775628 +0000 UTC m=+780.102424396" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.848389 4763 scope.go:117] "RemoveContainer" containerID="c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.874559 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" podStartSLOduration=2.39358097 podStartE2EDuration="9.874538155s" podCreationTimestamp="2025-12-01 09:27:33 +0000 UTC" firstStartedPulling="2025-12-01 09:27:34.673621119 +0000 UTC m=+771.942269887" lastFinishedPulling="2025-12-01 09:27:42.154578304 +0000 UTC m=+779.423227072" observedRunningTime="2025-12-01 09:27:42.855736478 +0000 UTC m=+780.124385246" watchObservedRunningTime="2025-12-01 09:27:42.874538155 +0000 UTC m=+780.143186923" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.879810 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrbq2"] Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.882412 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xrbq2"] Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.883280 4763 scope.go:117] "RemoveContainer" containerID="6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad" Dec 01 09:27:42 crc kubenswrapper[4763]: E1201 09:27:42.883829 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad\": container with ID starting with 6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad not found: ID does not exist" containerID="6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.883859 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad"} err="failed to get container status \"6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad\": rpc error: code = NotFound desc = could not find container \"6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad\": container with ID starting with 6635a17472e8b114b386256d9767ceae15c5fa69d1888186e4518ba33030c4ad not found: ID does not exist" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.883882 4763 scope.go:117] "RemoveContainer" containerID="99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7" Dec 01 09:27:42 crc kubenswrapper[4763]: E1201 09:27:42.884292 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7\": container with ID starting with 99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7 not found: ID does not exist" containerID="99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.884331 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7"} err="failed to get container status \"99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7\": rpc error: code = NotFound desc = could not find container \"99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7\": container with ID starting with 99b0d364b302a7c2b0d03ed42f2f7f1a5a5e6fa7407c8ec67830290fce64eec7 not found: ID does not exist" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.884357 4763 scope.go:117] "RemoveContainer" containerID="c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37" Dec 01 09:27:42 crc kubenswrapper[4763]: E1201 09:27:42.884895 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37\": container with ID starting with c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37 not found: ID does not exist" containerID="c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37" Dec 01 09:27:42 crc kubenswrapper[4763]: I1201 09:27:42.884915 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37"} err="failed to get container status \"c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37\": rpc error: code = NotFound desc = could not find container \"c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37\": container with ID starting with c4e64f95146bd66f047099b71d20a4d9e6f196fc4e45aef749be9618ce27ab37 not found: ID does not exist" Dec 01 09:27:43 crc kubenswrapper[4763]: I1201 09:27:43.008863 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" 
path="/var/lib/kubelet/pods/be49bf95-141c-4034-ac19-ad1f4b9dbe30/volumes" Dec 01 09:27:54 crc kubenswrapper[4763]: I1201 09:27:54.143471 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7c7c865bc4-5b725" Dec 01 09:28:03 crc kubenswrapper[4763]: I1201 09:28:03.929234 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:28:03 crc kubenswrapper[4763]: I1201 09:28:03.930813 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:28:13 crc kubenswrapper[4763]: I1201 09:28:13.811143 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7488746df5-gj8c5" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.574061 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cg9jb"] Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.574295 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="registry-server" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.574310 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="registry-server" Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.574331 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="extract-content" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.574337 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="extract-content" Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.574352 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="extract-utilities" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.574358 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="extract-utilities" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.574471 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="be49bf95-141c-4034-ac19-ad1f4b9dbe30" containerName="registry-server" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.576431 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.578863 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.578939 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.579067 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-wx68x" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.582043 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p"] Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.582791 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.589759 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.603835 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p"] Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.653701 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-startup\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.653855 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-reloader\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.653929 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-conf\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.653995 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-sockets\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.654073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fpxl\" (UniqueName: \"kubernetes.io/projected/9a10b30c-69e2-4037-bc91-dfd5191a6e72-kube-api-access-7fpxl\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.654099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1113818-415e-494e-8979-9de8da7db507-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-6kd8p\" (UID: \"e1113818-415e-494e-8979-9de8da7db507\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.654206 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-metrics\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.654274 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldj9s\" (UniqueName: \"kubernetes.io/projected/e1113818-415e-494e-8979-9de8da7db507-kube-api-access-ldj9s\") pod \"frr-k8s-webhook-server-7fcb986d4-6kd8p\" (UID: \"e1113818-415e-494e-8979-9de8da7db507\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.654306 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a10b30c-69e2-4037-bc91-dfd5191a6e72-metrics-certs\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.707230 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ljjmd"] Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.709048 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.713858 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.713988 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.714012 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.714134 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-gx769" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.734972 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-68xth"] Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.736053 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.751785 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.758051 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-sockets\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.758204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fpxl\" (UniqueName: \"kubernetes.io/projected/9a10b30c-69e2-4037-bc91-dfd5191a6e72-kube-api-access-7fpxl\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.758917 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-sockets\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.759191 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1113818-415e-494e-8979-9de8da7db507-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-6kd8p\" (UID: \"e1113818-415e-494e-8979-9de8da7db507\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.759368 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-metrics\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.759816 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-metrics\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.760017 4763 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.760203 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldj9s\" (UniqueName: \"kubernetes.io/projected/e1113818-415e-494e-8979-9de8da7db507-kube-api-access-ldj9s\") pod \"frr-k8s-webhook-server-7fcb986d4-6kd8p\" (UID: \"e1113818-415e-494e-8979-9de8da7db507\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.760802 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1113818-415e-494e-8979-9de8da7db507-cert podName:e1113818-415e-494e-8979-9de8da7db507 nodeName:}" failed. No retries permitted until 2025-12-01 09:28:15.260545095 +0000 UTC m=+812.529193863 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1113818-415e-494e-8979-9de8da7db507-cert") pod "frr-k8s-webhook-server-7fcb986d4-6kd8p" (UID: "e1113818-415e-494e-8979-9de8da7db507") : secret "frr-k8s-webhook-server-cert" not found Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.761121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a10b30c-69e2-4037-bc91-dfd5191a6e72-metrics-certs\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.761268 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-startup\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.761435 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-reloader\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.761589 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-conf\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.761978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-conf\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.764295 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9a10b30c-69e2-4037-bc91-dfd5191a6e72-reloader\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.764573 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9a10b30c-69e2-4037-bc91-dfd5191a6e72-frr-startup\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.773416 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-68xth"] Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.774335 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a10b30c-69e2-4037-bc91-dfd5191a6e72-metrics-certs\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.803977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldj9s\" (UniqueName: \"kubernetes.io/projected/e1113818-415e-494e-8979-9de8da7db507-kube-api-access-ldj9s\") pod 
\"frr-k8s-webhook-server-7fcb986d4-6kd8p\" (UID: \"e1113818-415e-494e-8979-9de8da7db507\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.813845 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fpxl\" (UniqueName: \"kubernetes.io/projected/9a10b30c-69e2-4037-bc91-dfd5191a6e72-kube-api-access-7fpxl\") pod \"frr-k8s-cg9jb\" (UID: \"9a10b30c-69e2-4037-bc91-dfd5191a6e72\") " pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.864128 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39d8a539-ae28-40cb-b850-d40b3cc839b8-cert\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.864229 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39d8a539-ae28-40cb-b850-d40b3cc839b8-metrics-certs\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.864257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.864287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-metrics-certs\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.864319 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bd336d9e-af02-4fc6-ae34-147c379ba374-metallb-excludel2\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.864342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqdc7\" (UniqueName: \"kubernetes.io/projected/bd336d9e-af02-4fc6-ae34-147c379ba374-kube-api-access-rqdc7\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.864368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfjkh\" (UniqueName: \"kubernetes.io/projected/39d8a539-ae28-40cb-b850-d40b3cc839b8-kube-api-access-xfjkh\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.896946 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.965060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39d8a539-ae28-40cb-b850-d40b3cc839b8-metrics-certs\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.965097 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.965119 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-metrics-certs\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.965142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bd336d9e-af02-4fc6-ae34-147c379ba374-metallb-excludel2\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.965176 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqdc7\" (UniqueName: \"kubernetes.io/projected/bd336d9e-af02-4fc6-ae34-147c379ba374-kube-api-access-rqdc7\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.965195 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfjkh\" (UniqueName: \"kubernetes.io/projected/39d8a539-ae28-40cb-b850-d40b3cc839b8-kube-api-access-xfjkh\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.965216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39d8a539-ae28-40cb-b850-d40b3cc839b8-cert\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.966444 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bd336d9e-af02-4fc6-ae34-147c379ba374-metallb-excludel2\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.966681 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.966770 4763 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.966875 4763 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist podName:bd336d9e-af02-4fc6-ae34-147c379ba374 nodeName:}" failed. No retries permitted until 2025-12-01 09:28:15.466786968 +0000 UTC m=+812.735435736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist") pod "speaker-ljjmd" (UID: "bd336d9e-af02-4fc6-ae34-147c379ba374") : secret "metallb-memberlist" not found Dec 01 09:28:14 crc kubenswrapper[4763]: E1201 09:28:14.966966 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-metrics-certs podName:bd336d9e-af02-4fc6-ae34-147c379ba374 nodeName:}" failed. No retries permitted until 2025-12-01 09:28:15.466958231 +0000 UTC m=+812.735606999 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-metrics-certs") pod "speaker-ljjmd" (UID: "bd336d9e-af02-4fc6-ae34-147c379ba374") : secret "speaker-certs-secret" not found Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.969740 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39d8a539-ae28-40cb-b850-d40b3cc839b8-metrics-certs\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.974142 4763 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.979068 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39d8a539-ae28-40cb-b850-d40b3cc839b8-cert\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.989036 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqdc7\" (UniqueName: \"kubernetes.io/projected/bd336d9e-af02-4fc6-ae34-147c379ba374-kube-api-access-rqdc7\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:14 crc kubenswrapper[4763]: I1201 09:28:14.998400 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfjkh\" (UniqueName: \"kubernetes.io/projected/39d8a539-ae28-40cb-b850-d40b3cc839b8-kube-api-access-xfjkh\") pod \"controller-f8648f98b-68xth\" (UID: \"39d8a539-ae28-40cb-b850-d40b3cc839b8\") " pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.052961 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.269503 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-68xth"] Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.271313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1113818-415e-494e-8979-9de8da7db507-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-6kd8p\" (UID: \"e1113818-415e-494e-8979-9de8da7db507\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:15 crc kubenswrapper[4763]: W1201 09:28:15.272890 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d8a539_ae28_40cb_b850_d40b3cc839b8.slice/crio-c939d7cfedf555a74ed0cf4b0a9f79ace25e98796175e9b8235abcdc70dfd83a WatchSource:0}: Error finding container c939d7cfedf555a74ed0cf4b0a9f79ace25e98796175e9b8235abcdc70dfd83a: Status 404 returned error can't find the container with id c939d7cfedf555a74ed0cf4b0a9f79ace25e98796175e9b8235abcdc70dfd83a Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.276626 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1113818-415e-494e-8979-9de8da7db507-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-6kd8p\" (UID: \"e1113818-415e-494e-8979-9de8da7db507\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.474197 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.474265 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-metrics-certs\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:15 crc kubenswrapper[4763]: E1201 09:28:15.474806 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 09:28:15 crc kubenswrapper[4763]: E1201 09:28:15.474940 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist podName:bd336d9e-af02-4fc6-ae34-147c379ba374 nodeName:}" failed. No retries permitted until 2025-12-01 09:28:16.474918185 +0000 UTC m=+813.743566953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist") pod "speaker-ljjmd" (UID: "bd336d9e-af02-4fc6-ae34-147c379ba374") : secret "metallb-memberlist" not found Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.477648 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-metrics-certs\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.503280 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.697281 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p"] Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.998534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" event={"ID":"e1113818-415e-494e-8979-9de8da7db507","Type":"ContainerStarted","Data":"c08912cfe19a541d15f6267b822d4a01a62828f68123c05af37a6138172531b7"} Dec 01 09:28:15 crc kubenswrapper[4763]: I1201 09:28:15.999321 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-68xth" event={"ID":"39d8a539-ae28-40cb-b850-d40b3cc839b8","Type":"ContainerStarted","Data":"c939d7cfedf555a74ed0cf4b0a9f79ace25e98796175e9b8235abcdc70dfd83a"} Dec 01 09:28:16 crc kubenswrapper[4763]: I1201 09:28:16.487919 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:16 crc kubenswrapper[4763]: E1201 09:28:16.488142 4763 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 01 09:28:16 crc kubenswrapper[4763]: E1201 09:28:16.488236 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist podName:bd336d9e-af02-4fc6-ae34-147c379ba374 nodeName:}" failed. No retries permitted until 2025-12-01 09:28:18.488217161 +0000 UTC m=+815.756865929 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist") pod "speaker-ljjmd" (UID: "bd336d9e-af02-4fc6-ae34-147c379ba374") : secret "metallb-memberlist" not found Dec 01 09:28:17 crc kubenswrapper[4763]: I1201 09:28:17.006183 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-68xth" event={"ID":"39d8a539-ae28-40cb-b850-d40b3cc839b8","Type":"ContainerStarted","Data":"dc5eec84063f76ba8fff8711c2f591ae1f07c722ac356e1ed0c41c055093cf0f"} Dec 01 09:28:17 crc kubenswrapper[4763]: I1201 09:28:17.015090 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerStarted","Data":"22215432243da3662e5fcd259c39338fe5504076414184e246558a74c34e643f"} Dec 01 09:28:18 crc kubenswrapper[4763]: I1201 09:28:18.022545 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-68xth" event={"ID":"39d8a539-ae28-40cb-b850-d40b3cc839b8","Type":"ContainerStarted","Data":"fcc280a8d01891e38e35a2ea34a2a80a38d4c50cb4cc6faf42cf27916210fc0f"} Dec 01 09:28:18 crc kubenswrapper[4763]: I1201 09:28:18.022843 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:18 crc kubenswrapper[4763]: I1201 09:28:18.516578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:18 crc kubenswrapper[4763]: I1201 09:28:18.535003 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bd336d9e-af02-4fc6-ae34-147c379ba374-memberlist\") pod \"speaker-ljjmd\" (UID: \"bd336d9e-af02-4fc6-ae34-147c379ba374\") " pod="metallb-system/speaker-ljjmd" Dec 01 09:28:18 crc kubenswrapper[4763]: I1201 09:28:18.643439 4763 util.go:30] "No sandbox for pod can be found. 
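
Note the retry spacing in the memberlist failures above: durationBeforeRetry doubles from 500ms to 1s to 2s, until the secret finally exists and the mount at 09:28:18.535003 succeeds. A minimal sketch of that doubling policy, assuming a cap and an attempt limit (kubelet tracks this per operation in nestedpendingoperations; this is not its actual code):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // retryWithBackoff doubles the wait after every failure, starting at
    // 500ms, mirroring the 500ms -> 1s -> 2s spacing in the log above.
    func retryWithBackoff(op func() error, initial, max time.Duration, attempts int) error {
        delay := initial
        for i := 0; i < attempts; i++ {
            if err := op(); err == nil {
                return nil
            }
            fmt.Printf("failed; no retries permitted for %v\n", delay)
            time.Sleep(delay)
            if delay *= 2; delay > max {
                delay = max
            }
        }
        return errors.New("giving up")
    }

    func main() {
        calls := 0
        _ = retryWithBackoff(func() error {
            calls++
            if calls < 4 {
                return errors.New(`secret "metallb-memberlist" not found`)
            }
            return nil // the secret has been created; the mount succeeds
        }, 500*time.Millisecond, 2*time.Minute, 10)
    }
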
Need to start a new one" pod="metallb-system/speaker-ljjmd" Dec 01 09:28:18 crc kubenswrapper[4763]: W1201 09:28:18.670612 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd336d9e_af02_4fc6_ae34_147c379ba374.slice/crio-9d3c35f75b3cc1e2c4568624166844379e7b08bd14511114d251fffa9a73f6da WatchSource:0}: Error finding container 9d3c35f75b3cc1e2c4568624166844379e7b08bd14511114d251fffa9a73f6da: Status 404 returned error can't find the container with id 9d3c35f75b3cc1e2c4568624166844379e7b08bd14511114d251fffa9a73f6da Dec 01 09:28:19 crc kubenswrapper[4763]: I1201 09:28:19.029106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ljjmd" event={"ID":"bd336d9e-af02-4fc6-ae34-147c379ba374","Type":"ContainerStarted","Data":"e44669f0823282db130e8730f76fd21b26b4db7514209dad2af815822e1f48aa"} Dec 01 09:28:19 crc kubenswrapper[4763]: I1201 09:28:19.029172 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ljjmd" event={"ID":"bd336d9e-af02-4fc6-ae34-147c379ba374","Type":"ContainerStarted","Data":"9d3c35f75b3cc1e2c4568624166844379e7b08bd14511114d251fffa9a73f6da"} Dec 01 09:28:20 crc kubenswrapper[4763]: I1201 09:28:20.038869 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ljjmd" event={"ID":"bd336d9e-af02-4fc6-ae34-147c379ba374","Type":"ContainerStarted","Data":"6d29364a1de82a2077493a2af770fdb9c4f4c3291b4cf50cd637579d14a3bc77"} Dec 01 09:28:20 crc kubenswrapper[4763]: I1201 09:28:20.039340 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-ljjmd" Dec 01 09:28:20 crc kubenswrapper[4763]: I1201 09:28:20.072318 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-68xth" podStartSLOduration=6.072290348 podStartE2EDuration="6.072290348s" podCreationTimestamp="2025-12-01 09:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:18.048255713 +0000 UTC m=+815.316904481" watchObservedRunningTime="2025-12-01 09:28:20.072290348 +0000 UTC m=+817.340939116" Dec 01 09:28:23 crc kubenswrapper[4763]: I1201 09:28:23.027011 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ljjmd" podStartSLOduration=9.026995089 podStartE2EDuration="9.026995089s" podCreationTimestamp="2025-12-01 09:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:28:20.066950276 +0000 UTC m=+817.335599044" watchObservedRunningTime="2025-12-01 09:28:23.026995089 +0000 UTC m=+820.295643857" Dec 01 09:28:25 crc kubenswrapper[4763]: I1201 09:28:25.079430 4763 generic.go:334] "Generic (PLEG): container finished" podID="9a10b30c-69e2-4037-bc91-dfd5191a6e72" containerID="108708b84c1e223891959d75fccfb5b160f885aa023c4b777887084a7a17b381" exitCode=0 Dec 01 09:28:25 crc kubenswrapper[4763]: I1201 09:28:25.079506 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerDied","Data":"108708b84c1e223891959d75fccfb5b160f885aa023c4b777887084a7a17b381"} Dec 01 09:28:25 crc kubenswrapper[4763]: I1201 09:28:25.083057 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" 
event={"ID":"e1113818-415e-494e-8979-9de8da7db507","Type":"ContainerStarted","Data":"964c825214ed34cd40630965c651d63ca7678c644e764b62135be37de5bb64a9"} Dec 01 09:28:25 crc kubenswrapper[4763]: I1201 09:28:25.083436 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:25 crc kubenswrapper[4763]: I1201 09:28:25.133145 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" podStartSLOduration=2.526818429 podStartE2EDuration="11.133125653s" podCreationTimestamp="2025-12-01 09:28:14 +0000 UTC" firstStartedPulling="2025-12-01 09:28:15.706622855 +0000 UTC m=+812.975271623" lastFinishedPulling="2025-12-01 09:28:24.312930079 +0000 UTC m=+821.581578847" observedRunningTime="2025-12-01 09:28:25.130288248 +0000 UTC m=+822.398937016" watchObservedRunningTime="2025-12-01 09:28:25.133125653 +0000 UTC m=+822.401774421" Dec 01 09:28:26 crc kubenswrapper[4763]: I1201 09:28:26.093820 4763 generic.go:334] "Generic (PLEG): container finished" podID="9a10b30c-69e2-4037-bc91-dfd5191a6e72" containerID="6bb988cc3e999402dd06cea774abaebe69ae5ca89d3800db881dc1ed3d91155e" exitCode=0 Dec 01 09:28:26 crc kubenswrapper[4763]: I1201 09:28:26.093894 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerDied","Data":"6bb988cc3e999402dd06cea774abaebe69ae5ca89d3800db881dc1ed3d91155e"} Dec 01 09:28:27 crc kubenswrapper[4763]: I1201 09:28:27.106938 4763 generic.go:334] "Generic (PLEG): container finished" podID="9a10b30c-69e2-4037-bc91-dfd5191a6e72" containerID="a769106dce69d212ab70049fc9dcb2a638338f9eb5a2d4df58b130eb5f47e7b1" exitCode=0 Dec 01 09:28:27 crc kubenswrapper[4763]: I1201 09:28:27.107063 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerDied","Data":"a769106dce69d212ab70049fc9dcb2a638338f9eb5a2d4df58b130eb5f47e7b1"} Dec 01 09:28:28 crc kubenswrapper[4763]: I1201 09:28:28.119744 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerStarted","Data":"8a05a78ef89a4ce1a54e9bc25144d227143cc801fbf8ba37446804d2ec91a649"} Dec 01 09:28:28 crc kubenswrapper[4763]: I1201 09:28:28.120078 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerStarted","Data":"9fb6b1cc0e2626cd9fbdcc075bfc636846c8c041d644116cebca0775feacd268"} Dec 01 09:28:28 crc kubenswrapper[4763]: I1201 09:28:28.120094 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerStarted","Data":"34d420af08dfb117efb3efc897ba794db6edc86f40a4b994621bc39528f8cfe2"} Dec 01 09:28:28 crc kubenswrapper[4763]: I1201 09:28:28.120106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerStarted","Data":"ac605753520e89621e4db8296611be1feb49beb015b0bcf806fd6e80ee96a1c1"} Dec 01 09:28:28 crc kubenswrapper[4763]: I1201 09:28:28.648349 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ljjmd" Dec 01 09:28:29 crc kubenswrapper[4763]: I1201 09:28:29.128746 
4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerStarted","Data":"878bc3e749135bc674a1fca69ad0b6f7171da49866e74aade9a7dec9b7dce4b1"} Dec 01 09:28:29 crc kubenswrapper[4763]: I1201 09:28:29.128997 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cg9jb" event={"ID":"9a10b30c-69e2-4037-bc91-dfd5191a6e72","Type":"ContainerStarted","Data":"9a9e59a2b57294da4b112e29474111142381d3e56c275c32b423d4f036c23aab"} Dec 01 09:28:29 crc kubenswrapper[4763]: I1201 09:28:29.130001 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:29 crc kubenswrapper[4763]: I1201 09:28:29.152093 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cg9jb" podStartSLOduration=7.571443185 podStartE2EDuration="15.152069055s" podCreationTimestamp="2025-12-01 09:28:14 +0000 UTC" firstStartedPulling="2025-12-01 09:28:16.73601709 +0000 UTC m=+814.004665858" lastFinishedPulling="2025-12-01 09:28:24.31664296 +0000 UTC m=+821.585291728" observedRunningTime="2025-12-01 09:28:29.148175751 +0000 UTC m=+826.416824529" watchObservedRunningTime="2025-12-01 09:28:29.152069055 +0000 UTC m=+826.420717823" Dec 01 09:28:29 crc kubenswrapper[4763]: I1201 09:28:29.897922 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:29 crc kubenswrapper[4763]: I1201 09:28:29.937739 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.756425 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kwrpc"] Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.757564 4763 util.go:30] "No sandbox for pod can be found. 
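
The pod_startup_latency_tracker entry above encodes a relationship worth spelling out: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Reproducing the frr-k8s-cg9jb numbers:

    package main

    import (
        "fmt"
        "time"
    )

    // podStartE2EDuration = observedRunningTime - podCreationTimestamp
    // podStartSLOduration = podStartE2EDuration - time spent pulling images
    func main() {
        parse := func(s string) time.Time {
            t, err := time.Parse(time.RFC3339Nano, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        firstPull := parse("2025-12-01T09:28:16.73601709Z")
        lastPull := parse("2025-12-01T09:28:24.31664296Z")

        e2e, _ := time.ParseDuration("15.152069055s")
        slo := e2e - lastPull.Sub(firstPull) // subtract the image-pull window
        fmt.Println(slo)                     // 7.571443185s, matching podStartSLOduration
    }
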
Need to start a new one" pod="openstack-operators/openstack-operator-index-kwrpc" Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.762651 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.762684 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-nk676" Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.768151 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.800196 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kwrpc"] Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.859633 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbqk\" (UniqueName: \"kubernetes.io/projected/d13e6ab8-1e1b-4632-a5e9-5574cb7f280e-kube-api-access-fzbqk\") pod \"openstack-operator-index-kwrpc\" (UID: \"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e\") " pod="openstack-operators/openstack-operator-index-kwrpc" Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.961359 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbqk\" (UniqueName: \"kubernetes.io/projected/d13e6ab8-1e1b-4632-a5e9-5574cb7f280e-kube-api-access-fzbqk\") pod \"openstack-operator-index-kwrpc\" (UID: \"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e\") " pod="openstack-operators/openstack-operator-index-kwrpc" Dec 01 09:28:31 crc kubenswrapper[4763]: I1201 09:28:31.990875 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbqk\" (UniqueName: \"kubernetes.io/projected/d13e6ab8-1e1b-4632-a5e9-5574cb7f280e-kube-api-access-fzbqk\") pod \"openstack-operator-index-kwrpc\" (UID: \"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e\") " pod="openstack-operators/openstack-operator-index-kwrpc" Dec 01 09:28:32 crc kubenswrapper[4763]: I1201 09:28:32.084155 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kwrpc" Dec 01 09:28:32 crc kubenswrapper[4763]: I1201 09:28:32.510223 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kwrpc"] Dec 01 09:28:32 crc kubenswrapper[4763]: W1201 09:28:32.519365 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13e6ab8_1e1b_4632_a5e9_5574cb7f280e.slice/crio-ea7603a86d1dce8204a0107ac5cd15773c6dc35163f23047df5e58268ea48d08 WatchSource:0}: Error finding container ea7603a86d1dce8204a0107ac5cd15773c6dc35163f23047df5e58268ea48d08: Status 404 returned error can't find the container with id ea7603a86d1dce8204a0107ac5cd15773c6dc35163f23047df5e58268ea48d08 Dec 01 09:28:33 crc kubenswrapper[4763]: I1201 09:28:33.158836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kwrpc" event={"ID":"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e","Type":"ContainerStarted","Data":"ea7603a86d1dce8204a0107ac5cd15773c6dc35163f23047df5e58268ea48d08"} Dec 01 09:28:33 crc kubenswrapper[4763]: I1201 09:28:33.929595 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:28:33 crc kubenswrapper[4763]: I1201 09:28:33.929991 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:28:33 crc kubenswrapper[4763]: I1201 09:28:33.930049 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:28:33 crc kubenswrapper[4763]: I1201 09:28:33.930701 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9e0e5adb882a530747c6596a975101cf0f536a3cb28e48dd137e2024a6a05f6"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:28:33 crc kubenswrapper[4763]: I1201 09:28:33.930760 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://d9e0e5adb882a530747c6596a975101cf0f536a3cb28e48dd137e2024a6a05f6" gracePeriod=600 Dec 01 09:28:34 crc kubenswrapper[4763]: I1201 09:28:34.167626 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="d9e0e5adb882a530747c6596a975101cf0f536a3cb28e48dd137e2024a6a05f6" exitCode=0 Dec 01 09:28:34 crc kubenswrapper[4763]: I1201 09:28:34.167665 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"d9e0e5adb882a530747c6596a975101cf0f536a3cb28e48dd137e2024a6a05f6"} Dec 01 09:28:34 crc kubenswrapper[4763]: I1201 
09:28:34.167695 4763 scope.go:117] "RemoveContainer" containerID="2225c7eb7487a6ea4c1fbbf2365ffe08cfe7e24776ec45b7e270f3787e713463" Dec 01 09:28:34 crc kubenswrapper[4763]: I1201 09:28:34.915202 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kwrpc"] Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.058786 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-68xth" Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.525367 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-6kd8p" Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.526943 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-4qcm5"] Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.527731 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.535770 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4qcm5"] Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.615525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvbcd\" (UniqueName: \"kubernetes.io/projected/f78f08c4-82f9-45c0-92d4-325c6e066d44-kube-api-access-nvbcd\") pod \"openstack-operator-index-4qcm5\" (UID: \"f78f08c4-82f9-45c0-92d4-325c6e066d44\") " pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.717610 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvbcd\" (UniqueName: \"kubernetes.io/projected/f78f08c4-82f9-45c0-92d4-325c6e066d44-kube-api-access-nvbcd\") pod \"openstack-operator-index-4qcm5\" (UID: \"f78f08c4-82f9-45c0-92d4-325c6e066d44\") " pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.746935 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvbcd\" (UniqueName: \"kubernetes.io/projected/f78f08c4-82f9-45c0-92d4-325c6e066d44-kube-api-access-nvbcd\") pod \"openstack-operator-index-4qcm5\" (UID: \"f78f08c4-82f9-45c0-92d4-325c6e066d44\") " pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:35 crc kubenswrapper[4763]: I1201 09:28:35.846653 4763 util.go:30] "No sandbox for pod can be found. 
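
The machine-config-daemon restart at 09:28:33-34 above is the standard HTTP liveness flow: the prober GETs the container's health endpoint, connection refused counts as a failure, and the kubelet then kills the container with its termination grace period (gracePeriod=600 here) so it can be restarted. A minimal probe-style check against the same endpoint (the URL and timeout are assumptions; kubelet counts 2xx/3xx responses as success):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func probe(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. "connect: connection refused", as in the log
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("unexpected status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probe("http://127.0.0.1:8798/health", time.Second); err != nil {
            fmt.Println("Probe failed:", err)
            // the kubelet would now kill and restart the container
        }
    }
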
Need to start a new one" pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:36 crc kubenswrapper[4763]: I1201 09:28:36.181391 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"fbcaa44c81e6e848c09eeb8a68cb5f7f03225b440f52ed6609277022adeaf191"} Dec 01 09:28:36 crc kubenswrapper[4763]: I1201 09:28:36.472198 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-4qcm5"] Dec 01 09:28:36 crc kubenswrapper[4763]: W1201 09:28:36.522010 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78f08c4_82f9_45c0_92d4_325c6e066d44.slice/crio-c1a17430df102f9f8eefd8c2c15dc0f39426e39b4dbf84fc3899fc9e7f35b456 WatchSource:0}: Error finding container c1a17430df102f9f8eefd8c2c15dc0f39426e39b4dbf84fc3899fc9e7f35b456: Status 404 returned error can't find the container with id c1a17430df102f9f8eefd8c2c15dc0f39426e39b4dbf84fc3899fc9e7f35b456 Dec 01 09:28:37 crc kubenswrapper[4763]: I1201 09:28:37.188143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4qcm5" event={"ID":"f78f08c4-82f9-45c0-92d4-325c6e066d44","Type":"ContainerStarted","Data":"c1a17430df102f9f8eefd8c2c15dc0f39426e39b4dbf84fc3899fc9e7f35b456"} Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.213128 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-4qcm5" event={"ID":"f78f08c4-82f9-45c0-92d4-325c6e066d44","Type":"ContainerStarted","Data":"007a4d6140bfc4217fdd64cfedf5c5214baf9b3dea3b891dd744fe4dc7828eb1"} Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.215583 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kwrpc" event={"ID":"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e","Type":"ContainerStarted","Data":"8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7"} Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.215672 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-kwrpc" podUID="d13e6ab8-1e1b-4632-a5e9-5574cb7f280e" containerName="registry-server" containerID="cri-o://8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7" gracePeriod=2 Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.233571 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-4qcm5" podStartSLOduration=2.609332425 podStartE2EDuration="4.233551758s" podCreationTimestamp="2025-12-01 09:28:35 +0000 UTC" firstStartedPulling="2025-12-01 09:28:36.524747082 +0000 UTC m=+833.793395850" lastFinishedPulling="2025-12-01 09:28:38.148966405 +0000 UTC m=+835.417615183" observedRunningTime="2025-12-01 09:28:39.230907622 +0000 UTC m=+836.499556400" watchObservedRunningTime="2025-12-01 09:28:39.233551758 +0000 UTC m=+836.502200516" Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.247521 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kwrpc" podStartSLOduration=2.622036155 podStartE2EDuration="8.247504623s" podCreationTimestamp="2025-12-01 09:28:31 +0000 UTC" firstStartedPulling="2025-12-01 09:28:32.521362875 +0000 UTC m=+829.790011643" lastFinishedPulling="2025-12-01 
09:28:38.146831323 +0000 UTC m=+835.415480111" observedRunningTime="2025-12-01 09:28:39.244942449 +0000 UTC m=+836.513591217" watchObservedRunningTime="2025-12-01 09:28:39.247504623 +0000 UTC m=+836.516153391" Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.583271 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kwrpc" Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.637248 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbqk\" (UniqueName: \"kubernetes.io/projected/d13e6ab8-1e1b-4632-a5e9-5574cb7f280e-kube-api-access-fzbqk\") pod \"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e\" (UID: \"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e\") " Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.661206 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13e6ab8-1e1b-4632-a5e9-5574cb7f280e-kube-api-access-fzbqk" (OuterVolumeSpecName: "kube-api-access-fzbqk") pod "d13e6ab8-1e1b-4632-a5e9-5574cb7f280e" (UID: "d13e6ab8-1e1b-4632-a5e9-5574cb7f280e"). InnerVolumeSpecName "kube-api-access-fzbqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:39 crc kubenswrapper[4763]: I1201 09:28:39.738358 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbqk\" (UniqueName: \"kubernetes.io/projected/d13e6ab8-1e1b-4632-a5e9-5574cb7f280e-kube-api-access-fzbqk\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.224210 4763 generic.go:334] "Generic (PLEG): container finished" podID="d13e6ab8-1e1b-4632-a5e9-5574cb7f280e" containerID="8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7" exitCode=0 Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.224275 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kwrpc" Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.224263 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kwrpc" event={"ID":"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e","Type":"ContainerDied","Data":"8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7"} Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.224412 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kwrpc" event={"ID":"d13e6ab8-1e1b-4632-a5e9-5574cb7f280e","Type":"ContainerDied","Data":"ea7603a86d1dce8204a0107ac5cd15773c6dc35163f23047df5e58268ea48d08"} Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.224432 4763 scope.go:117] "RemoveContainer" containerID="8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7" Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.252657 4763 scope.go:117] "RemoveContainer" containerID="8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7" Dec 01 09:28:40 crc kubenswrapper[4763]: E1201 09:28:40.253062 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7\": container with ID starting with 8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7 not found: ID does not exist" containerID="8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7" Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.253095 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7"} err="failed to get container status \"8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7\": rpc error: code = NotFound desc = could not find container \"8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7\": container with ID starting with 8bd1a914d9260e27c804b13c34f69cf5b752c8d4d9640834107bbb3110a297b7 not found: ID does not exist" Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.258509 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-kwrpc"] Dec 01 09:28:40 crc kubenswrapper[4763]: I1201 09:28:40.262536 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-kwrpc"] Dec 01 09:28:41 crc kubenswrapper[4763]: I1201 09:28:41.004312 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13e6ab8-1e1b-4632-a5e9-5574cb7f280e" path="/var/lib/kubelet/pods/d13e6ab8-1e1b-4632-a5e9-5574cb7f280e/volumes" Dec 01 09:28:44 crc kubenswrapper[4763]: I1201 09:28:44.901005 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cg9jb" Dec 01 09:28:45 crc kubenswrapper[4763]: I1201 09:28:45.847125 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:45 crc kubenswrapper[4763]: I1201 09:28:45.847182 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:45 crc kubenswrapper[4763]: I1201 09:28:45.889518 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:46 crc kubenswrapper[4763]: I1201 
09:28:46.299145 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-4qcm5" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.356910 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr"] Dec 01 09:28:47 crc kubenswrapper[4763]: E1201 09:28:47.357463 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13e6ab8-1e1b-4632-a5e9-5574cb7f280e" containerName="registry-server" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.357475 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13e6ab8-1e1b-4632-a5e9-5574cb7f280e" containerName="registry-server" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.357579 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13e6ab8-1e1b-4632-a5e9-5574cb7f280e" containerName="registry-server" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.358529 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.360296 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-b9fgf" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.365404 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr"] Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.484761 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-bundle\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.484881 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-util\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.484909 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4j5\" (UniqueName: \"kubernetes.io/projected/40baea6c-c32e-4f93-b01e-d94c309c05f7-kube-api-access-2k4j5\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.586417 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-util\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.586557 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2k4j5\" (UniqueName: \"kubernetes.io/projected/40baea6c-c32e-4f93-b01e-d94c309c05f7-kube-api-access-2k4j5\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.586617 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-bundle\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.587225 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-util\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.587497 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-bundle\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.630895 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k4j5\" (UniqueName: \"kubernetes.io/projected/40baea6c-c32e-4f93-b01e-d94c309c05f7-kube-api-access-2k4j5\") pod \"85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:47 crc kubenswrapper[4763]: I1201 09:28:47.683698 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:48 crc kubenswrapper[4763]: I1201 09:28:48.242983 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr"] Dec 01 09:28:48 crc kubenswrapper[4763]: I1201 09:28:48.273511 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" event={"ID":"40baea6c-c32e-4f93-b01e-d94c309c05f7","Type":"ContainerStarted","Data":"7270c624b0c98f6996ff1404c6e24c19ccb96d2ae17c20fce1bb87dbcee15dae"} Dec 01 09:28:49 crc kubenswrapper[4763]: I1201 09:28:49.282425 4763 generic.go:334] "Generic (PLEG): container finished" podID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerID="3379a8e0b19bbe862ef1b4a6ec4cb7c7550fae0d5fa76de728149236e5f60390" exitCode=0 Dec 01 09:28:49 crc kubenswrapper[4763]: I1201 09:28:49.282556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" event={"ID":"40baea6c-c32e-4f93-b01e-d94c309c05f7","Type":"ContainerDied","Data":"3379a8e0b19bbe862ef1b4a6ec4cb7c7550fae0d5fa76de728149236e5f60390"} Dec 01 09:28:50 crc kubenswrapper[4763]: I1201 09:28:50.289592 4763 generic.go:334] "Generic (PLEG): container finished" podID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerID="38c22d7f86d5d1d41e63ad39e9dfa243f874b8136f2264d5bcea2546e354931d" exitCode=0 Dec 01 09:28:50 crc kubenswrapper[4763]: I1201 09:28:50.289896 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" event={"ID":"40baea6c-c32e-4f93-b01e-d94c309c05f7","Type":"ContainerDied","Data":"38c22d7f86d5d1d41e63ad39e9dfa243f874b8136f2264d5bcea2546e354931d"} Dec 01 09:28:51 crc kubenswrapper[4763]: I1201 09:28:51.297830 4763 generic.go:334] "Generic (PLEG): container finished" podID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerID="04f9670ee69eadb8bbbfa3be707f6cec6cb6af33da20ae0ec91a3b03583bc61e" exitCode=0 Dec 01 09:28:51 crc kubenswrapper[4763]: I1201 09:28:51.297869 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" event={"ID":"40baea6c-c32e-4f93-b01e-d94c309c05f7","Type":"ContainerDied","Data":"04f9670ee69eadb8bbbfa3be707f6cec6cb6af33da20ae0ec91a3b03583bc61e"} Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.715015 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.876923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-util\") pod \"40baea6c-c32e-4f93-b01e-d94c309c05f7\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.877017 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k4j5\" (UniqueName: \"kubernetes.io/projected/40baea6c-c32e-4f93-b01e-d94c309c05f7-kube-api-access-2k4j5\") pod \"40baea6c-c32e-4f93-b01e-d94c309c05f7\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.877051 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-bundle\") pod \"40baea6c-c32e-4f93-b01e-d94c309c05f7\" (UID: \"40baea6c-c32e-4f93-b01e-d94c309c05f7\") " Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.877882 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-bundle" (OuterVolumeSpecName: "bundle") pod "40baea6c-c32e-4f93-b01e-d94c309c05f7" (UID: "40baea6c-c32e-4f93-b01e-d94c309c05f7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.881810 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40baea6c-c32e-4f93-b01e-d94c309c05f7-kube-api-access-2k4j5" (OuterVolumeSpecName: "kube-api-access-2k4j5") pod "40baea6c-c32e-4f93-b01e-d94c309c05f7" (UID: "40baea6c-c32e-4f93-b01e-d94c309c05f7"). InnerVolumeSpecName "kube-api-access-2k4j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.892544 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-util" (OuterVolumeSpecName: "util") pod "40baea6c-c32e-4f93-b01e-d94c309c05f7" (UID: "40baea6c-c32e-4f93-b01e-d94c309c05f7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.978915 4763 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-util\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.979230 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k4j5\" (UniqueName: \"kubernetes.io/projected/40baea6c-c32e-4f93-b01e-d94c309c05f7-kube-api-access-2k4j5\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:52 crc kubenswrapper[4763]: I1201 09:28:52.979244 4763 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40baea6c-c32e-4f93-b01e-d94c309c05f7-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:28:53 crc kubenswrapper[4763]: I1201 09:28:53.315978 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" event={"ID":"40baea6c-c32e-4f93-b01e-d94c309c05f7","Type":"ContainerDied","Data":"7270c624b0c98f6996ff1404c6e24c19ccb96d2ae17c20fce1bb87dbcee15dae"} Dec 01 09:28:53 crc kubenswrapper[4763]: I1201 09:28:53.316036 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7270c624b0c98f6996ff1404c6e24c19ccb96d2ae17c20fce1bb87dbcee15dae" Dec 01 09:28:53 crc kubenswrapper[4763]: I1201 09:28:53.316075 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.388217 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw"] Dec 01 09:28:59 crc kubenswrapper[4763]: E1201 09:28:59.389746 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerName="util" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.389828 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerName="util" Dec 01 09:28:59 crc kubenswrapper[4763]: E1201 09:28:59.389885 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerName="extract" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.389937 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerName="extract" Dec 01 09:28:59 crc kubenswrapper[4763]: E1201 09:28:59.390003 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerName="pull" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.390060 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerName="pull" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.390226 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="40baea6c-c32e-4f93-b01e-d94c309c05f7" containerName="extract" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.390781 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.394404 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-hms8h" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.427769 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw"] Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.575350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4swx\" (UniqueName: \"kubernetes.io/projected/de261a18-aec0-4ea5-aaf9-e313631599e6-kube-api-access-w4swx\") pod \"openstack-operator-controller-operator-5794fdf75-8f5zw\" (UID: \"de261a18-aec0-4ea5-aaf9-e313631599e6\") " pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.676915 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4swx\" (UniqueName: \"kubernetes.io/projected/de261a18-aec0-4ea5-aaf9-e313631599e6-kube-api-access-w4swx\") pod \"openstack-operator-controller-operator-5794fdf75-8f5zw\" (UID: \"de261a18-aec0-4ea5-aaf9-e313631599e6\") " pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.695834 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4swx\" (UniqueName: \"kubernetes.io/projected/de261a18-aec0-4ea5-aaf9-e313631599e6-kube-api-access-w4swx\") pod \"openstack-operator-controller-operator-5794fdf75-8f5zw\" (UID: \"de261a18-aec0-4ea5-aaf9-e313631599e6\") " pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.711720 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" Dec 01 09:28:59 crc kubenswrapper[4763]: I1201 09:28:59.946199 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw"] Dec 01 09:29:00 crc kubenswrapper[4763]: I1201 09:29:00.369976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" event={"ID":"de261a18-aec0-4ea5-aaf9-e313631599e6","Type":"ContainerStarted","Data":"7a2373d373abc71085dcd02f7a0dfc0f920ea6bf7725a12c8aa0fe7f1cf2fdd8"} Dec 01 09:29:10 crc kubenswrapper[4763]: I1201 09:29:10.553311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" event={"ID":"de261a18-aec0-4ea5-aaf9-e313631599e6","Type":"ContainerStarted","Data":"11028361cb942552a61ace2f4920431f2a35bb56c68aaa555fa7f340e5bd668c"} Dec 01 09:29:10 crc kubenswrapper[4763]: I1201 09:29:10.553928 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" Dec 01 09:29:10 crc kubenswrapper[4763]: I1201 09:29:10.588185 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" podStartSLOduration=2.099476208 podStartE2EDuration="11.588165707s" podCreationTimestamp="2025-12-01 09:28:59 +0000 UTC" firstStartedPulling="2025-12-01 09:28:59.965324146 +0000 UTC m=+857.233972914" lastFinishedPulling="2025-12-01 09:29:09.454013645 +0000 UTC m=+866.722662413" observedRunningTime="2025-12-01 09:29:10.581835704 +0000 UTC m=+867.850484502" watchObservedRunningTime="2025-12-01 09:29:10.588165707 +0000 UTC m=+867.856814475" Dec 01 09:29:19 crc kubenswrapper[4763]: I1201 09:29:19.713891 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5794fdf75-8f5zw" Dec 01 09:29:34 crc kubenswrapper[4763]: I1201 09:29:34.850085 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wwwr5"] Dec 01 09:29:34 crc kubenswrapper[4763]: I1201 09:29:34.852071 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:34 crc kubenswrapper[4763]: I1201 09:29:34.870479 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwwr5"] Dec 01 09:29:34 crc kubenswrapper[4763]: I1201 09:29:34.956656 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-catalog-content\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:34 crc kubenswrapper[4763]: I1201 09:29:34.956728 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr9sj\" (UniqueName: \"kubernetes.io/projected/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-kube-api-access-mr9sj\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:34 crc kubenswrapper[4763]: I1201 09:29:34.956778 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-utilities\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:35 crc kubenswrapper[4763]: I1201 09:29:35.057669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr9sj\" (UniqueName: \"kubernetes.io/projected/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-kube-api-access-mr9sj\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:35 crc kubenswrapper[4763]: I1201 09:29:35.057723 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-utilities\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:35 crc kubenswrapper[4763]: I1201 09:29:35.057827 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-catalog-content\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:35 crc kubenswrapper[4763]: I1201 09:29:35.058259 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-utilities\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:35 crc kubenswrapper[4763]: I1201 09:29:35.058352 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-catalog-content\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:35 crc kubenswrapper[4763]: I1201 09:29:35.084276 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mr9sj\" (UniqueName: \"kubernetes.io/projected/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-kube-api-access-mr9sj\") pod \"redhat-marketplace-wwwr5\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") " pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:35 crc kubenswrapper[4763]: I1201 09:29:35.173586 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:35 crc kubenswrapper[4763]: I1201 09:29:35.798765 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwwr5"] Dec 01 09:29:36 crc kubenswrapper[4763]: I1201 09:29:36.821039 4763 generic.go:334] "Generic (PLEG): container finished" podID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerID="762c98ec1c325d1e7669471771cc45c98d664048b14c70ba7dcd5e518d8fd19a" exitCode=0 Dec 01 09:29:36 crc kubenswrapper[4763]: I1201 09:29:36.821129 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwwr5" event={"ID":"b6138a92-1fe1-48a0-8178-210dfe3ea1d0","Type":"ContainerDied","Data":"762c98ec1c325d1e7669471771cc45c98d664048b14c70ba7dcd5e518d8fd19a"} Dec 01 09:29:36 crc kubenswrapper[4763]: I1201 09:29:36.821392 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwwr5" event={"ID":"b6138a92-1fe1-48a0-8178-210dfe3ea1d0","Type":"ContainerStarted","Data":"eea3c738dbafec928c7d22fb6af5311ae29d6d7a83631d6f95873471faed09c6"} Dec 01 09:29:37 crc kubenswrapper[4763]: I1201 09:29:37.831685 4763 generic.go:334] "Generic (PLEG): container finished" podID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerID="1f302ab97c731d6ca027be40a4d510bad87a0b829dd8e525ce2e4b4091292212" exitCode=0 Dec 01 09:29:37 crc kubenswrapper[4763]: I1201 09:29:37.832861 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwwr5" event={"ID":"b6138a92-1fe1-48a0-8178-210dfe3ea1d0","Type":"ContainerDied","Data":"1f302ab97c731d6ca027be40a4d510bad87a0b829dd8e525ce2e4b4091292212"} Dec 01 09:29:38 crc kubenswrapper[4763]: I1201 09:29:38.839387 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwwr5" event={"ID":"b6138a92-1fe1-48a0-8178-210dfe3ea1d0","Type":"ContainerStarted","Data":"25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9"} Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.102256 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wwwr5" podStartSLOduration=3.554109258 podStartE2EDuration="5.102228192s" podCreationTimestamp="2025-12-01 09:29:34 +0000 UTC" firstStartedPulling="2025-12-01 09:29:36.823257525 +0000 UTC m=+894.091906293" lastFinishedPulling="2025-12-01 09:29:38.371376459 +0000 UTC m=+895.640025227" observedRunningTime="2025-12-01 09:29:38.862867871 +0000 UTC m=+896.131516639" watchObservedRunningTime="2025-12-01 09:29:39.102228192 +0000 UTC m=+896.370876960" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.104768 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.106377 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.110968 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-gdnbg" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.121149 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.122212 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.122922 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f274\" (UniqueName: \"kubernetes.io/projected/1c980f4b-c55c-4a2e-9461-9f89ec0165c3-kube-api-access-5f274\") pod \"cinder-operator-controller-manager-859b6ccc6-nh5mz\" (UID: \"1c980f4b-c55c-4a2e-9461-9f89ec0165c3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.122999 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnj6j\" (UniqueName: \"kubernetes.io/projected/1208b653-3551-4266-99b9-e83fb86b4771-kube-api-access-lnj6j\") pod \"barbican-operator-controller-manager-7d9dfd778-qh7dr\" (UID: \"1208b653-3551-4266-99b9-e83fb86b4771\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.126768 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xlwz9" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.139164 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.161935 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.174055 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.174989 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.183435 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-knjfq" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.217869 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.224610 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnj6j\" (UniqueName: \"kubernetes.io/projected/1208b653-3551-4266-99b9-e83fb86b4771-kube-api-access-lnj6j\") pod \"barbican-operator-controller-manager-7d9dfd778-qh7dr\" (UID: \"1208b653-3551-4266-99b9-e83fb86b4771\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.224701 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f274\" (UniqueName: \"kubernetes.io/projected/1c980f4b-c55c-4a2e-9461-9f89ec0165c3-kube-api-access-5f274\") pod \"cinder-operator-controller-manager-859b6ccc6-nh5mz\" (UID: \"1c980f4b-c55c-4a2e-9461-9f89ec0165c3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.224723 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9s58\" (UniqueName: \"kubernetes.io/projected/ee716572-1b36-4216-84a6-ad3f4ac2b7f6-kube-api-access-v9s58\") pod \"designate-operator-controller-manager-78b4bc895b-l547d\" (UID: \"ee716572-1b36-4216-84a6-ad3f4ac2b7f6\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.225650 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.227017 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.232243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-4z4sc" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.265411 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.292722 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f274\" (UniqueName: \"kubernetes.io/projected/1c980f4b-c55c-4a2e-9461-9f89ec0165c3-kube-api-access-5f274\") pod \"cinder-operator-controller-manager-859b6ccc6-nh5mz\" (UID: \"1c980f4b-c55c-4a2e-9461-9f89ec0165c3\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.300157 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnj6j\" (UniqueName: \"kubernetes.io/projected/1208b653-3551-4266-99b9-e83fb86b4771-kube-api-access-lnj6j\") pod \"barbican-operator-controller-manager-7d9dfd778-qh7dr\" (UID: \"1208b653-3551-4266-99b9-e83fb86b4771\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.305053 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.315707 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.322008 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.326750 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-g7m6c" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.328104 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9s58\" (UniqueName: \"kubernetes.io/projected/ee716572-1b36-4216-84a6-ad3f4ac2b7f6-kube-api-access-v9s58\") pod \"designate-operator-controller-manager-78b4bc895b-l547d\" (UID: \"ee716572-1b36-4216-84a6-ad3f4ac2b7f6\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.371051 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9s58\" (UniqueName: \"kubernetes.io/projected/ee716572-1b36-4216-84a6-ad3f4ac2b7f6-kube-api-access-v9s58\") pod \"designate-operator-controller-manager-78b4bc895b-l547d\" (UID: \"ee716572-1b36-4216-84a6-ad3f4ac2b7f6\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.414533 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.415911 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.427604 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-4rsq2" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.427792 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.429285 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gncqm\" (UniqueName: \"kubernetes.io/projected/5325eff2-4650-499e-9cad-f486bae74fce-kube-api-access-gncqm\") pod \"glance-operator-controller-manager-668d9c48b9-xx8m2\" (UID: \"5325eff2-4650-499e-9cad-f486bae74fce\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.429354 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nstmp\" (UniqueName: \"kubernetes.io/projected/eae2e950-9f81-49cc-926e-380b81a0f0e7-kube-api-access-nstmp\") pod \"heat-operator-controller-manager-5f64f6f8bb-rjzdx\" (UID: \"eae2e950-9f81-49cc-926e-380b81a0f0e7\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.438209 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.440989 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.473693 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.477494 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.499413 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.499817 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.510902 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hv47l" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.529616 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.530297 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwwm8\" (UniqueName: \"kubernetes.io/projected/1e77763b-639f-46aa-a798-e39251aa8636-kube-api-access-jwwm8\") pod \"horizon-operator-controller-manager-68c6d99b8f-dbhs6\" (UID: \"1e77763b-639f-46aa-a798-e39251aa8636\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.530366 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nstmp\" (UniqueName: \"kubernetes.io/projected/eae2e950-9f81-49cc-926e-380b81a0f0e7-kube-api-access-nstmp\") pod \"heat-operator-controller-manager-5f64f6f8bb-rjzdx\" (UID: \"eae2e950-9f81-49cc-926e-380b81a0f0e7\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.530426 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gncqm\" (UniqueName: \"kubernetes.io/projected/5325eff2-4650-499e-9cad-f486bae74fce-kube-api-access-gncqm\") pod \"glance-operator-controller-manager-668d9c48b9-xx8m2\" (UID: \"5325eff2-4650-499e-9cad-f486bae74fce\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.553963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gncqm\" (UniqueName: \"kubernetes.io/projected/5325eff2-4650-499e-9cad-f486bae74fce-kube-api-access-gncqm\") pod \"glance-operator-controller-manager-668d9c48b9-xx8m2\" (UID: \"5325eff2-4650-499e-9cad-f486bae74fce\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.579663 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.581294 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.590785 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mltrc" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.619335 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.633510 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwwm8\" (UniqueName: \"kubernetes.io/projected/1e77763b-639f-46aa-a798-e39251aa8636-kube-api-access-jwwm8\") pod \"horizon-operator-controller-manager-68c6d99b8f-dbhs6\" (UID: \"1e77763b-639f-46aa-a798-e39251aa8636\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.633578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65tnz\" (UniqueName: \"kubernetes.io/projected/2aecf763-7a48-4c6f-a66c-ea391befd47a-kube-api-access-65tnz\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.633665 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.637334 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.638819 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.640700 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nstmp\" (UniqueName: \"kubernetes.io/projected/eae2e950-9f81-49cc-926e-380b81a0f0e7-kube-api-access-nstmp\") pod \"heat-operator-controller-manager-5f64f6f8bb-rjzdx\" (UID: \"eae2e950-9f81-49cc-926e-380b81a0f0e7\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.650702 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-d6s8l" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.659528 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.660835 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.666637 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-8pjqs" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.683575 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.692565 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwwm8\" (UniqueName: \"kubernetes.io/projected/1e77763b-639f-46aa-a798-e39251aa8636-kube-api-access-jwwm8\") pod \"horizon-operator-controller-manager-68c6d99b8f-dbhs6\" (UID: \"1e77763b-639f-46aa-a798-e39251aa8636\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.714021 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.728375 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.733812 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.734811 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.735818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.735861 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65tnz\" (UniqueName: \"kubernetes.io/projected/2aecf763-7a48-4c6f-a66c-ea391befd47a-kube-api-access-65tnz\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.735881 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlbqs\" (UniqueName: \"kubernetes.io/projected/c0ed7161-2907-48a6-894d-c6e3a1f47e0e-kube-api-access-rlbqs\") pod \"ironic-operator-controller-manager-6c548fd776-4srbr\" (UID: \"c0ed7161-2907-48a6-894d-c6e3a1f47e0e\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.735912 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl6k9\" (UniqueName: \"kubernetes.io/projected/f5580ab2-73e2-4766-8e9c-f217fd4c079d-kube-api-access-rl6k9\") pod \"keystone-operator-controller-manager-546d4bdf48-t74fr\" (UID: \"f5580ab2-73e2-4766-8e9c-f217fd4c079d\") " 
pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" Dec 01 09:29:39 crc kubenswrapper[4763]: E1201 09:29:39.736025 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:29:39 crc kubenswrapper[4763]: E1201 09:29:39.736063 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert podName:2aecf763-7a48-4c6f-a66c-ea391befd47a nodeName:}" failed. No retries permitted until 2025-12-01 09:29:40.236049086 +0000 UTC m=+897.504697854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert") pod "infra-operator-controller-manager-57548d458d-9vjrk" (UID: "2aecf763-7a48-4c6f-a66c-ea391befd47a") : secret "infra-operator-webhook-server-cert" not found Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.738786 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-mptp5" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.738985 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.785530 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.786514 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.787373 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.788210 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.788946 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.789216 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.803242 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.803319 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.823873 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.823937 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.824968 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.829738 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.838075 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.839226 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.843428 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlbqs\" (UniqueName: \"kubernetes.io/projected/c0ed7161-2907-48a6-894d-c6e3a1f47e0e-kube-api-access-rlbqs\") pod \"ironic-operator-controller-manager-6c548fd776-4srbr\" (UID: \"c0ed7161-2907-48a6-894d-c6e3a1f47e0e\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.843507 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjdcg\" (UniqueName: \"kubernetes.io/projected/4e44f450-61c5-4f49-b16b-8c9e0f060879-kube-api-access-bjdcg\") pod \"manila-operator-controller-manager-6546668bfd-9x2q4\" (UID: \"4e44f450-61c5-4f49-b16b-8c9e0f060879\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.843545 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl6k9\" (UniqueName: \"kubernetes.io/projected/f5580ab2-73e2-4766-8e9c-f217fd4c079d-kube-api-access-rl6k9\") pod \"keystone-operator-controller-manager-546d4bdf48-t74fr\" (UID: \"f5580ab2-73e2-4766-8e9c-f217fd4c079d\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.843582 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9sxk\" (UniqueName: \"kubernetes.io/projected/9eef10a1-bfcc-412c-9687-fee23d90d448-kube-api-access-l9sxk\") pod \"mariadb-operator-controller-manager-56bbcc9d85-nb5kc\" (UID: \"9eef10a1-bfcc-412c-9687-fee23d90d448\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.853164 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.853208 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk"] Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.863749 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.905067 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.905296 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-lhtgk" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.905444 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4ltnf" Dec 01 09:29:39 crc kubenswrapper[4763]: I1201 09:29:39.905657 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-rnr9l" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.004928 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4ffrp" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.005227 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wrqwx" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.013018 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.016921 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl6k9\" (UniqueName: \"kubernetes.io/projected/f5580ab2-73e2-4766-8e9c-f217fd4c079d-kube-api-access-rl6k9\") pod \"keystone-operator-controller-manager-546d4bdf48-t74fr\" (UID: \"f5580ab2-73e2-4766-8e9c-f217fd4c079d\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.020411 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.021261 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9v6b\" (UniqueName: \"kubernetes.io/projected/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-kube-api-access-d9v6b\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.021306 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjdcg\" (UniqueName: \"kubernetes.io/projected/4e44f450-61c5-4f49-b16b-8c9e0f060879-kube-api-access-bjdcg\") pod \"manila-operator-controller-manager-6546668bfd-9x2q4\" (UID: \"4e44f450-61c5-4f49-b16b-8c9e0f060879\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.021327 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w2cx\" (UniqueName: \"kubernetes.io/projected/c274cd4c-0c77-485c-8d8f-116a2f7b013b-kube-api-access-2w2cx\") pod \"ovn-operator-controller-manager-b6456fdb6-k2nzk\" (UID: \"c274cd4c-0c77-485c-8d8f-116a2f7b013b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.021363 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptgbz\" (UniqueName: \"kubernetes.io/projected/ddef7d32-1d4c-496d-be36-7ae7af64205a-kube-api-access-ptgbz\") pod \"octavia-operator-controller-manager-998648c74-jbvjc\" (UID: \"ddef7d32-1d4c-496d-be36-7ae7af64205a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.021385 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcgkn\" (UniqueName: \"kubernetes.io/projected/a0713966-4e10-4b8b-84bc-6560d1b1bf5a-kube-api-access-gcgkn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-9f5r9\" (UID: \"a0713966-4e10-4b8b-84bc-6560d1b1bf5a\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.021406 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9sxk\" (UniqueName: \"kubernetes.io/projected/9eef10a1-bfcc-412c-9687-fee23d90d448-kube-api-access-l9sxk\") pod \"mariadb-operator-controller-manager-56bbcc9d85-nb5kc\" (UID: \"9eef10a1-bfcc-412c-9687-fee23d90d448\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.021499 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.021536 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bptt\" (UniqueName: \"kubernetes.io/projected/0ee4b811-c59a-4120-bf78-53fe9e049d4b-kube-api-access-8bptt\") pod \"nova-operator-controller-manager-697bc559fc-jxvhd\" (UID: \"0ee4b811-c59a-4120-bf78-53fe9e049d4b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.022980 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65tnz\" (UniqueName: \"kubernetes.io/projected/2aecf763-7a48-4c6f-a66c-ea391befd47a-kube-api-access-65tnz\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.034067 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fwfvl" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.054124 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlbqs\" (UniqueName: \"kubernetes.io/projected/c0ed7161-2907-48a6-894d-c6e3a1f47e0e-kube-api-access-rlbqs\") pod \"ironic-operator-controller-manager-6c548fd776-4srbr\" (UID: \"c0ed7161-2907-48a6-894d-c6e3a1f47e0e\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.054907 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9sxk\" (UniqueName: \"kubernetes.io/projected/9eef10a1-bfcc-412c-9687-fee23d90d448-kube-api-access-l9sxk\") pod \"mariadb-operator-controller-manager-56bbcc9d85-nb5kc\" (UID: \"9eef10a1-bfcc-412c-9687-fee23d90d448\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.068152 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.081099 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjdcg\" (UniqueName: \"kubernetes.io/projected/4e44f450-61c5-4f49-b16b-8c9e0f060879-kube-api-access-bjdcg\") pod \"manila-operator-controller-manager-6546668bfd-9x2q4\" (UID: \"4e44f450-61c5-4f49-b16b-8c9e0f060879\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.123817 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9v6b\" (UniqueName: \"kubernetes.io/projected/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-kube-api-access-d9v6b\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.124345 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w2cx\" (UniqueName: \"kubernetes.io/projected/c274cd4c-0c77-485c-8d8f-116a2f7b013b-kube-api-access-2w2cx\") pod \"ovn-operator-controller-manager-b6456fdb6-k2nzk\" (UID: \"c274cd4c-0c77-485c-8d8f-116a2f7b013b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.124394 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptgbz\" (UniqueName: \"kubernetes.io/projected/ddef7d32-1d4c-496d-be36-7ae7af64205a-kube-api-access-ptgbz\") pod \"octavia-operator-controller-manager-998648c74-jbvjc\" (UID: \"ddef7d32-1d4c-496d-be36-7ae7af64205a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.124426 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgkn\" (UniqueName: \"kubernetes.io/projected/a0713966-4e10-4b8b-84bc-6560d1b1bf5a-kube-api-access-gcgkn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-9f5r9\" (UID: \"a0713966-4e10-4b8b-84bc-6560d1b1bf5a\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.134269 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.135895 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.135996 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bptt\" (UniqueName: \"kubernetes.io/projected/0ee4b811-c59a-4120-bf78-53fe9e049d4b-kube-api-access-8bptt\") pod \"nova-operator-controller-manager-697bc559fc-jxvhd\" (UID: \"0ee4b811-c59a-4120-bf78-53fe9e049d4b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.141713 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.141783 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert podName:e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:40.641767447 +0000 UTC m=+897.910416215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" (UID: "e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.150760 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.170848 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.191714 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.234217 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-754ln" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.239041 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgkn\" (UniqueName: \"kubernetes.io/projected/a0713966-4e10-4b8b-84bc-6560d1b1bf5a-kube-api-access-gcgkn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-9f5r9\" (UID: \"a0713966-4e10-4b8b-84bc-6560d1b1bf5a\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.318691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w2cx\" (UniqueName: \"kubernetes.io/projected/c274cd4c-0c77-485c-8d8f-116a2f7b013b-kube-api-access-2w2cx\") pod \"ovn-operator-controller-manager-b6456fdb6-k2nzk\" (UID: \"c274cd4c-0c77-485c-8d8f-116a2f7b013b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.322752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nskj6\" (UniqueName: \"kubernetes.io/projected/13a2ac2b-0374-4da0-abbf-6aecbc3afbb8-kube-api-access-nskj6\") pod \"placement-operator-controller-manager-78f8948974-tk5xf\" (UID: \"13a2ac2b-0374-4da0-abbf-6aecbc3afbb8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.328706 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.330014 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bptt\" (UniqueName: \"kubernetes.io/projected/0ee4b811-c59a-4120-bf78-53fe9e049d4b-kube-api-access-8bptt\") pod \"nova-operator-controller-manager-697bc559fc-jxvhd\" (UID: \"0ee4b811-c59a-4120-bf78-53fe9e049d4b\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.331133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.332111 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.332266 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert podName:2aecf763-7a48-4c6f-a66c-ea391befd47a nodeName:}" failed. No retries permitted until 2025-12-01 09:29:41.332239978 +0000 UTC m=+898.600888756 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert") pod "infra-operator-controller-manager-57548d458d-9vjrk" (UID: "2aecf763-7a48-4c6f-a66c-ea391befd47a") : secret "infra-operator-webhook-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.338838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptgbz\" (UniqueName: \"kubernetes.io/projected/ddef7d32-1d4c-496d-be36-7ae7af64205a-kube-api-access-ptgbz\") pod \"octavia-operator-controller-manager-998648c74-jbvjc\" (UID: \"ddef7d32-1d4c-496d-be36-7ae7af64205a\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.369767 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.374641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9v6b\" (UniqueName: \"kubernetes.io/projected/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-kube-api-access-d9v6b\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.412952 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.414352 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.417767 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.418703 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-487fh" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.433822 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nskj6\" (UniqueName: \"kubernetes.io/projected/13a2ac2b-0374-4da0-abbf-6aecbc3afbb8-kube-api-access-nskj6\") pod \"placement-operator-controller-manager-78f8948974-tk5xf\" (UID: \"13a2ac2b-0374-4da0-abbf-6aecbc3afbb8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.433870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcr66\" (UniqueName: \"kubernetes.io/projected/20d8f0e3-8406-4e55-adbf-0681e090a82e-kube-api-access-jcr66\") pod \"swift-operator-controller-manager-5f8c65bbfc-mqvfc\" (UID: \"20d8f0e3-8406-4e55-adbf-0681e090a82e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.435168 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.468782 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-64k9k"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.469869 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.487513 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.503689 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-97v2j" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.532284 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.533340 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.538168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nskj6\" (UniqueName: \"kubernetes.io/projected/13a2ac2b-0374-4da0-abbf-6aecbc3afbb8-kube-api-access-nskj6\") pod \"placement-operator-controller-manager-78f8948974-tk5xf\" (UID: \"13a2ac2b-0374-4da0-abbf-6aecbc3afbb8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.550496 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qswkr\" (UniqueName: \"kubernetes.io/projected/89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0-kube-api-access-qswkr\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2skdl\" (UID: \"89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.550628 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkq5r\" (UniqueName: \"kubernetes.io/projected/664dabb5-40f4-44c4-be9d-1870e153c877-kube-api-access-qkq5r\") pod \"test-operator-controller-manager-5854674fcc-64k9k\" (UID: \"664dabb5-40f4-44c4-be9d-1870e153c877\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.550698 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp5dt\" (UniqueName: \"kubernetes.io/projected/3f494774-a168-4199-bfff-e73f64a669cf-kube-api-access-vp5dt\") pod \"watcher-operator-controller-manager-769dc69bc-ctr5d\" (UID: \"3f494774-a168-4199-bfff-e73f64a669cf\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.550735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcr66\" (UniqueName: \"kubernetes.io/projected/20d8f0e3-8406-4e55-adbf-0681e090a82e-kube-api-access-jcr66\") pod \"swift-operator-controller-manager-5f8c65bbfc-mqvfc\" (UID: 
\"20d8f0e3-8406-4e55-adbf-0681e090a82e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.550749 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-hbcbk" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.552243 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.556797 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-64k9k"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.655398 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.655481 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qswkr\" (UniqueName: \"kubernetes.io/projected/89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0-kube-api-access-qswkr\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2skdl\" (UID: \"89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.655502 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkq5r\" (UniqueName: \"kubernetes.io/projected/664dabb5-40f4-44c4-be9d-1870e153c877-kube-api-access-qkq5r\") pod \"test-operator-controller-manager-5854674fcc-64k9k\" (UID: \"664dabb5-40f4-44c4-be9d-1870e153c877\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.655526 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp5dt\" (UniqueName: \"kubernetes.io/projected/3f494774-a168-4199-bfff-e73f64a669cf-kube-api-access-vp5dt\") pod \"watcher-operator-controller-manager-769dc69bc-ctr5d\" (UID: \"3f494774-a168-4199-bfff-e73f64a669cf\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.655875 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.655925 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert podName:e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:41.655908617 +0000 UTC m=+898.924557395 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" (UID: "e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.656047 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.666953 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.762835 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.763801 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.772801 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.776832 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-htdmn" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.780232 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkq5r\" (UniqueName: \"kubernetes.io/projected/664dabb5-40f4-44c4-be9d-1870e153c877-kube-api-access-qkq5r\") pod \"test-operator-controller-manager-5854674fcc-64k9k\" (UID: \"664dabb5-40f4-44c4-be9d-1870e153c877\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.781286 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.807515 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp5dt\" (UniqueName: \"kubernetes.io/projected/3f494774-a168-4199-bfff-e73f64a669cf-kube-api-access-vp5dt\") pod \"watcher-operator-controller-manager-769dc69bc-ctr5d\" (UID: \"3f494774-a168-4199-bfff-e73f64a669cf\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.813267 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcr66\" (UniqueName: \"kubernetes.io/projected/20d8f0e3-8406-4e55-adbf-0681e090a82e-kube-api-access-jcr66\") pod \"swift-operator-controller-manager-5f8c65bbfc-mqvfc\" (UID: \"20d8f0e3-8406-4e55-adbf-0681e090a82e\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.814209 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qswkr\" (UniqueName: \"kubernetes.io/projected/89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0-kube-api-access-qswkr\") pod \"telemetry-operator-controller-manager-76cc84c6bb-2skdl\" (UID: \"89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 
09:29:40.832992 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.850004 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.861321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.861363 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx284\" (UniqueName: \"kubernetes.io/projected/dd971a72-ce63-45cb-9457-43fcea25f677-kube-api-access-fx284\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.861406 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.874443 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.910122 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.923949 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dmb2k"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.925343 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.964265 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dmb2k"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.964969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.964991 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-utilities\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.965016 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx284\" (UniqueName: \"kubernetes.io/projected/dd971a72-ce63-45cb-9457-43fcea25f677-kube-api-access-fx284\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.965059 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.965105 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rzk\" (UniqueName: \"kubernetes.io/projected/96e0bbed-711d-48ca-bf8c-fa678c91c3de-kube-api-access-j8rzk\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.965127 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-catalog-content\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.965262 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.965303 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:41.46528923 +0000 UTC m=+898.733937998 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "metrics-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.965955 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: E1201 09:29:40.965979 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:41.46597119 +0000 UTC m=+898.734619958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "webhook-server-cert" not found Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.974944 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp"] Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.976038 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.979126 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-jsg78" Dec 01 09:29:40 crc kubenswrapper[4763]: I1201 09:29:40.980590 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.109547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-catalog-content\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.109647 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dszlc\" (UniqueName: \"kubernetes.io/projected/6b7d748f-a9e4-416a-8fd7-9fa46ca2060d-kube-api-access-dszlc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s4rlp\" (UID: \"6b7d748f-a9e4-416a-8fd7-9fa46ca2060d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.109745 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-utilities\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.109842 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rzk\" (UniqueName: \"kubernetes.io/projected/96e0bbed-711d-48ca-bf8c-fa678c91c3de-kube-api-access-j8rzk\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.110790 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-catalog-content\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.111280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-utilities\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.112589 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.135486 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rzk\" (UniqueName: \"kubernetes.io/projected/96e0bbed-711d-48ca-bf8c-fa678c91c3de-kube-api-access-j8rzk\") pod \"certified-operators-dmb2k\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.167633 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.184530 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx284\" (UniqueName: \"kubernetes.io/projected/dd971a72-ce63-45cb-9457-43fcea25f677-kube-api-access-fx284\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.186769 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp"] Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.216881 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dszlc\" (UniqueName: \"kubernetes.io/projected/6b7d748f-a9e4-416a-8fd7-9fa46ca2060d-kube-api-access-dszlc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s4rlp\" (UID: \"6b7d748f-a9e4-416a-8fd7-9fa46ca2060d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.226378 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.259080 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dszlc\" (UniqueName: \"kubernetes.io/projected/6b7d748f-a9e4-416a-8fd7-9fa46ca2060d-kube-api-access-dszlc\") pod \"rabbitmq-cluster-operator-manager-668c99d594-s4rlp\" (UID: \"6b7d748f-a9e4-416a-8fd7-9fa46ca2060d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.341479 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz"] Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.411265 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d"] Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.411681 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr"] Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.433965 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:41 crc kubenswrapper[4763]: E1201 09:29:41.434183 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:29:41 crc kubenswrapper[4763]: E1201 09:29:41.434252 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert podName:2aecf763-7a48-4c6f-a66c-ea391befd47a nodeName:}" failed. No retries permitted until 2025-12-01 09:29:43.434235917 +0000 UTC m=+900.702884685 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert") pod "infra-operator-controller-manager-57548d458d-9vjrk" (UID: "2aecf763-7a48-4c6f-a66c-ea391befd47a") : secret "infra-operator-webhook-server-cert" not found Dec 01 09:29:41 crc kubenswrapper[4763]: W1201 09:29:41.500342 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c980f4b_c55c_4a2e_9461_9f89ec0165c3.slice/crio-e393d38e309b32717c1b37dcad5a12850cc57e6d0bc56171a5a8bba83e999b71 WatchSource:0}: Error finding container e393d38e309b32717c1b37dcad5a12850cc57e6d0bc56171a5a8bba83e999b71: Status 404 returned error can't find the container with id e393d38e309b32717c1b37dcad5a12850cc57e6d0bc56171a5a8bba83e999b71 Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.594856 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.595816 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.595876 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:41 crc kubenswrapper[4763]: E1201 09:29:41.596036 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 09:29:41 crc kubenswrapper[4763]: E1201 09:29:41.596086 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:42.596069026 +0000 UTC m=+899.864717804 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "webhook-server-cert" not found Dec 01 09:29:41 crc kubenswrapper[4763]: E1201 09:29:41.596551 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 09:29:41 crc kubenswrapper[4763]: E1201 09:29:41.596596 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:42.596585321 +0000 UTC m=+899.865234089 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "metrics-server-cert" not found Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.699243 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:29:41 crc kubenswrapper[4763]: E1201 09:29:41.699495 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:29:41 crc kubenswrapper[4763]: E1201 09:29:41.699543 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert podName:e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:43.69952787 +0000 UTC m=+900.968176638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" (UID: "e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 01 09:29:41 crc kubenswrapper[4763]: I1201 09:29:41.903614 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.038831 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.139718 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" event={"ID":"ee716572-1b36-4216-84a6-ad3f4ac2b7f6","Type":"ContainerStarted","Data":"39be0f6808346d3c56ba495aba520854d4db0286eff472f2f2b605a0ccc56236"} Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.142744 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" event={"ID":"eae2e950-9f81-49cc-926e-380b81a0f0e7","Type":"ContainerStarted","Data":"9bb043fb08a2dff338dc1ce607306034454ab2949c8902400e130612abe368c0"} Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.152704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" event={"ID":"1c980f4b-c55c-4a2e-9461-9f89ec0165c3","Type":"ContainerStarted","Data":"e393d38e309b32717c1b37dcad5a12850cc57e6d0bc56171a5a8bba83e999b71"} Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.158168 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" event={"ID":"1208b653-3551-4266-99b9-e83fb86b4771","Type":"ContainerStarted","Data":"ef39d2d8c8d94b340e1cbddc51f1f2f6f5cbc8b7ca9d72be2ff224672c3a971f"} Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.161580 4763 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" event={"ID":"5325eff2-4650-499e-9cad-f486bae74fce","Type":"ContainerStarted","Data":"ab146def9fa72f8450984823a72698e405b084368a3a307c7e5e885f360521ff"} Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.289150 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.367679 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.435790 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.479520 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.526109 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.533254 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.558595 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.589950 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.633239 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.633366 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:29:42 crc kubenswrapper[4763]: E1201 09:29:42.633505 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 01 09:29:42 crc kubenswrapper[4763]: E1201 09:29:42.633553 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:44.633539701 +0000 UTC m=+901.902188459 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "metrics-server-cert" not found Dec 01 09:29:42 crc kubenswrapper[4763]: E1201 09:29:42.633595 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 01 09:29:42 crc kubenswrapper[4763]: E1201 09:29:42.633614 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:44.633608473 +0000 UTC m=+901.902257231 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "webhook-server-cert" not found Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.809237 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7hjvw"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.811042 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.831119 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hjvw"] Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.835230 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-utilities\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.835273 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-catalog-content\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.835345 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr6bt\" (UniqueName: \"kubernetes.io/projected/c11aee5c-47cc-4313-9380-0146961c94b8-kube-api-access-vr6bt\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.936314 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-utilities\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.936372 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-catalog-content\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.936489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr6bt\" (UniqueName: \"kubernetes.io/projected/c11aee5c-47cc-4313-9380-0146961c94b8-kube-api-access-vr6bt\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.937342 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-utilities\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.937398 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-catalog-content\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.953908 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd"] Dec 01 09:29:42 crc kubenswrapper[4763]: W1201 09:29:42.964523 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee4b811_c59a_4120_bf78_53fe9e049d4b.slice/crio-69eb7cd6d345b54fdd461adcc746558ee1cdd5417be79977b84caaffc0b30398 WatchSource:0}: Error finding container 69eb7cd6d345b54fdd461adcc746558ee1cdd5417be79977b84caaffc0b30398: Status 404 returned error can't find the container with id 69eb7cd6d345b54fdd461adcc746558ee1cdd5417be79977b84caaffc0b30398 Dec 01 09:29:42 crc kubenswrapper[4763]: I1201 09:29:42.966295 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr6bt\" (UniqueName: \"kubernetes.io/projected/c11aee5c-47cc-4313-9380-0146961c94b8-kube-api-access-vr6bt\") pod \"community-operators-7hjvw\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.100114 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf"] Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.102732 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc"] Dec 01 09:29:43 crc kubenswrapper[4763]: W1201 09:29:43.140374 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20d8f0e3_8406_4e55_adbf_0681e090a82e.slice/crio-9a75e0e5683cabcd19266859016ceffd5f8a921a8399ca823b487c4456fa127b WatchSource:0}: Error finding container 9a75e0e5683cabcd19266859016ceffd5f8a921a8399ca823b487c4456fa127b: Status 404 returned error can't find the container with id 9a75e0e5683cabcd19266859016ceffd5f8a921a8399ca823b487c4456fa127b Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.167697 4763 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.230933 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl"] Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.247252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" event={"ID":"a0713966-4e10-4b8b-84bc-6560d1b1bf5a","Type":"ContainerStarted","Data":"1da7fdb2ca1a16c95249a1680f5a07709187017af3e45029fba5103831742677"} Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.270643 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dmb2k"] Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.270681 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" event={"ID":"ddef7d32-1d4c-496d-be36-7ae7af64205a","Type":"ContainerStarted","Data":"21e2b5e018486a22112b3b9ac1a3eb0ae9a455d2d8343716031e6b675043341b"} Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.272951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" event={"ID":"9eef10a1-bfcc-412c-9687-fee23d90d448","Type":"ContainerStarted","Data":"bcd90b3014d739061eb1dbae7712cd895faa1d621ff5ed27854d865f34e05eae"} Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.281232 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" event={"ID":"c274cd4c-0c77-485c-8d8f-116a2f7b013b","Type":"ContainerStarted","Data":"f8d34fb262097b24bdaa5df767139a70aff8f22198c61ba18c0f2b0396e3a21c"} Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.301853 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-64k9k"] Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.312611 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" event={"ID":"13a2ac2b-0374-4da0-abbf-6aecbc3afbb8","Type":"ContainerStarted","Data":"6f418e40dfb8f1d6275d1dd4d7c76759014e77d9f089c684aabc2b832cbc16a4"} Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.313959 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp"] Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.319784 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" event={"ID":"20d8f0e3-8406-4e55-adbf-0681e090a82e","Type":"ContainerStarted","Data":"9a75e0e5683cabcd19266859016ceffd5f8a921a8399ca823b487c4456fa127b"} Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.325748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" event={"ID":"1e77763b-639f-46aa-a798-e39251aa8636","Type":"ContainerStarted","Data":"961583d76d2afe4d6d9192e516695bd95703a894c2d08c5cc5aa076602e9df3a"} Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.332586 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" 
event={"ID":"4e44f450-61c5-4f49-b16b-8c9e0f060879","Type":"ContainerStarted","Data":"7f303fdbd552a1ffd129ff71d764c880b4681e08243f25c96e91c7777173b657"} Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.337336 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vp5dt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-ctr5d_openstack-operators(3f494774-a168-4199-bfff-e73f64a669cf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.341710 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d"] Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.342279 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: 
{{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vp5dt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-ctr5d_openstack-operators(3f494774-a168-4199-bfff-e73f64a669cf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.343492 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" podUID="3f494774-a168-4199-bfff-e73f64a669cf" Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.348866 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" event={"ID":"c0ed7161-2907-48a6-894d-c6e3a1f47e0e","Type":"ContainerStarted","Data":"5da273e155aaf759feb5f7821a66e8f3b29798054db153a15d5b2a44c672cc2c"} Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.350617 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" event={"ID":"0ee4b811-c59a-4120-bf78-53fe9e049d4b","Type":"ContainerStarted","Data":"69eb7cd6d345b54fdd461adcc746558ee1cdd5417be79977b84caaffc0b30398"} Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.351705 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dszlc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-s4rlp_openstack-operators(6b7d748f-a9e4-416a-8fd7-9fa46ca2060d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.352648 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" event={"ID":"f5580ab2-73e2-4766-8e9c-f217fd4c079d","Type":"ContainerStarted","Data":"68c4da6b910a6ff88093762b40a81b6ea58adad7a9d1023f3f2ebfcbb23d9e86"} Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.352764 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" podUID="6b7d748f-a9e4-416a-8fd7-9fa46ca2060d" Dec 01 09:29:43 crc kubenswrapper[4763]: W1201 09:29:43.360937 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e0bbed_711d_48ca_bf8c_fa678c91c3de.slice/crio-cf51dd2f0ac412aca971b4391030b2b09e90586e1b72a7cb33e45e468b88df70 WatchSource:0}: Error finding container cf51dd2f0ac412aca971b4391030b2b09e90586e1b72a7cb33e45e468b88df70: Status 404 returned error can't find the container with id cf51dd2f0ac412aca971b4391030b2b09e90586e1b72a7cb33e45e468b88df70 Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.487502 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.488516 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.488614 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert podName:2aecf763-7a48-4c6f-a66c-ea391befd47a nodeName:}" failed. No retries permitted until 2025-12-01 09:29:47.48858126 +0000 UTC m=+904.757230028 (durationBeforeRetry 4s). 
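[Annotation] The "pull QPS exceeded" errors for the watcher and rabbitmq-cluster operators are not registry failures: with this many operator pods starting at once, the kubelet's client-side image-pull rate limit rejects pulls that exceed its budget, and the affected containers fall back to image pull backoff. The limiter is a token bucket (registryPullQPS/registryBurst in the kubelet configuration; 5 QPS with a burst of 10 are the upstream defaults, assumed here rather than read from this node). A minimal token-bucket sketch with golang.org/x/time/rate:

package main

import (
	"fmt"

	"golang.org/x/time/rate"
)

func main() {
	// Token bucket like the kubelet's image-pull limiter: 5 pulls/sec,
	// burst of 10 (assumed defaults; not this node's actual config).
	limiter := rate.NewLimiter(rate.Limit(5), 10)
	for i := 1; i <= 15; i++ {
		if limiter.Allow() {
			fmt.Printf("pull %d: allowed\n", i)
		} else {
			// This is the condition the kubelet reports as "pull QPS exceeded".
			fmt.Printf("pull %d: pull QPS exceeded\n", i)
		}
	}
}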
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert") pod "infra-operator-controller-manager-57548d458d-9vjrk" (UID: "2aecf763-7a48-4c6f-a66c-ea391befd47a") : secret "infra-operator-webhook-server-cert" not found
Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.754865 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hjvw"]
Dec 01 09:29:43 crc kubenswrapper[4763]: I1201 09:29:43.799349 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj"
Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.799586 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 09:29:43 crc kubenswrapper[4763]: E1201 09:29:43.799633 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert podName:e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:47.799619552 +0000 UTC m=+905.068268320 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" (UID: "e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 09:29:44 crc kubenswrapper[4763]: I1201 09:29:44.364175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" event={"ID":"664dabb5-40f4-44c4-be9d-1870e153c877","Type":"ContainerStarted","Data":"88bb0a59ba35cd31c5f38995c6769d0704305f24e898926c695045a4465ef46b"}
Dec 01 09:29:44 crc kubenswrapper[4763]: I1201 09:29:44.367306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" event={"ID":"89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0","Type":"ContainerStarted","Data":"0b7c2e7f4cbf692a8c0a62755ecf990871cdc4b99b11aa62b8fc65a04d873365"}
Dec 01 09:29:44 crc kubenswrapper[4763]: I1201 09:29:44.368440 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" event={"ID":"3f494774-a168-4199-bfff-e73f64a669cf","Type":"ContainerStarted","Data":"8b128fc8326998c48b522520c1b3cf683c6d483b79d679711108d43893076a71"}
Dec 01 09:29:44 crc kubenswrapper[4763]: E1201 09:29:44.372902 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" podUID="3f494774-a168-4199-bfff-e73f64a669cf"
Dec 01 09:29:44 crc kubenswrapper[4763]: I1201 09:29:44.375633 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" event={"ID":"6b7d748f-a9e4-416a-8fd7-9fa46ca2060d","Type":"ContainerStarted","Data":"1c8c3f87950b38b4c4a1dda7ca7d16157ecd3f500ff83c99163e328c88cec867"}
Dec 01 09:29:44 crc kubenswrapper[4763]: E1201 09:29:44.377422 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" podUID="6b7d748f-a9e4-416a-8fd7-9fa46ca2060d"
Dec 01 09:29:44 crc kubenswrapper[4763]: I1201 09:29:44.377995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmb2k" event={"ID":"96e0bbed-711d-48ca-bf8c-fa678c91c3de","Type":"ContainerStarted","Data":"cf51dd2f0ac412aca971b4391030b2b09e90586e1b72a7cb33e45e468b88df70"}
Dec 01 09:29:44 crc kubenswrapper[4763]: I1201 09:29:44.379422 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hjvw" event={"ID":"c11aee5c-47cc-4313-9380-0146961c94b8","Type":"ContainerStarted","Data":"582da9d42373f0559d839ad39b58cce8fd5719eeef9ae3393af935a8881e1cda"}
Dec 01 09:29:44 crc kubenswrapper[4763]: I1201 09:29:44.714024 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:44 crc kubenswrapper[4763]: I1201 09:29:44.714120 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:44 crc kubenswrapper[4763]: E1201 09:29:44.714344 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 01 09:29:44 crc kubenswrapper[4763]: E1201 09:29:44.714413 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:48.714391934 +0000 UTC m=+905.983040712 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "webhook-server-cert" not found
Dec 01 09:29:44 crc kubenswrapper[4763]: E1201 09:29:44.714798 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 01 09:29:44 crc kubenswrapper[4763]: E1201 09:29:44.714880 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:48.714862278 +0000 UTC m=+905.983511056 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "metrics-server-cert" not found
Dec 01 09:29:45 crc kubenswrapper[4763]: I1201 09:29:45.174156 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wwwr5"
Dec 01 09:29:45 crc kubenswrapper[4763]: I1201 09:29:45.174216 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wwwr5"
Dec 01 09:29:45 crc kubenswrapper[4763]: I1201 09:29:45.245213 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wwwr5"
Dec 01 09:29:45 crc kubenswrapper[4763]: I1201 09:29:45.403367 4763 generic.go:334] "Generic (PLEG): container finished" podID="c11aee5c-47cc-4313-9380-0146961c94b8" containerID="aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77" exitCode=0
Dec 01 09:29:45 crc kubenswrapper[4763]: I1201 09:29:45.403429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hjvw" event={"ID":"c11aee5c-47cc-4313-9380-0146961c94b8","Type":"ContainerDied","Data":"aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77"}
Dec 01 09:29:45 crc kubenswrapper[4763]: I1201 09:29:45.406446 4763 generic.go:334] "Generic (PLEG): container finished" podID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerID="b7543c0d2746dc176b4e6426b0790d34897675b5378c7a12473fd9c86d444347" exitCode=0
Dec 01 09:29:45 crc kubenswrapper[4763]: I1201 09:29:45.407423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmb2k" event={"ID":"96e0bbed-711d-48ca-bf8c-fa678c91c3de","Type":"ContainerDied","Data":"b7543c0d2746dc176b4e6426b0790d34897675b5378c7a12473fd9c86d444347"}
Dec 01 09:29:45 crc kubenswrapper[4763]: E1201 09:29:45.409049 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" podUID="6b7d748f-a9e4-416a-8fd7-9fa46ca2060d"
Dec 01 09:29:45 crc kubenswrapper[4763]: E1201 09:29:45.409426 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" podUID="3f494774-a168-4199-bfff-e73f64a669cf"
Dec 01 09:29:45 crc kubenswrapper[4763]: I1201 09:29:45.509806 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wwwr5"
Dec 01 09:29:47 crc kubenswrapper[4763]: I1201 09:29:47.536959 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk"
Dec 01 09:29:47 crc kubenswrapper[4763]: E1201 09:29:47.537222 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 01 09:29:47 crc kubenswrapper[4763]: E1201 09:29:47.537395 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert podName:2aecf763-7a48-4c6f-a66c-ea391befd47a nodeName:}" failed. No retries permitted until 2025-12-01 09:29:55.537377247 +0000 UTC m=+912.806026015 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert") pod "infra-operator-controller-manager-57548d458d-9vjrk" (UID: "2aecf763-7a48-4c6f-a66c-ea391befd47a") : secret "infra-operator-webhook-server-cert" not found
Dec 01 09:29:47 crc kubenswrapper[4763]: I1201 09:29:47.850754 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj"
Dec 01 09:29:47 crc kubenswrapper[4763]: E1201 09:29:47.851010 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 09:29:47 crc kubenswrapper[4763]: E1201 09:29:47.851067 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert podName:e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:55.851049405 +0000 UTC m=+913.119698183 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" (UID: "e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 09:29:48 crc kubenswrapper[4763]: I1201 09:29:48.183166 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwwr5"]
Dec 01 09:29:48 crc kubenswrapper[4763]: I1201 09:29:48.183471 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wwwr5" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="registry-server" containerID="cri-o://25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9" gracePeriod=2
Dec 01 09:29:48 crc kubenswrapper[4763]: I1201 09:29:48.438918 4763 generic.go:334] "Generic (PLEG): container finished" podID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerID="25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9" exitCode=0
Dec 01 09:29:48 crc kubenswrapper[4763]: I1201 09:29:48.438959 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwwr5" event={"ID":"b6138a92-1fe1-48a0-8178-210dfe3ea1d0","Type":"ContainerDied","Data":"25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9"}
Dec 01 09:29:48 crc kubenswrapper[4763]: I1201 09:29:48.769399 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:48 crc kubenswrapper[4763]: I1201 09:29:48.769475 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:48 crc kubenswrapper[4763]: E1201 09:29:48.769592 4763 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 01 09:29:48 crc kubenswrapper[4763]: E1201 09:29:48.769648 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:56.769626219 +0000 UTC m=+914.038274987 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "webhook-server-cert" not found
Dec 01 09:29:48 crc kubenswrapper[4763]: E1201 09:29:48.769966 4763 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 01 09:29:48 crc kubenswrapper[4763]: E1201 09:29:48.769993 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs podName:dd971a72-ce63-45cb-9457-43fcea25f677 nodeName:}" failed. No retries permitted until 2025-12-01 09:29:56.769984969 +0000 UTC m=+914.038633737 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs") pod "openstack-operator-controller-manager-6d555457c4-jcpzh" (UID: "dd971a72-ce63-45cb-9457-43fcea25f677") : secret "metrics-server-cert" not found
Dec 01 09:29:55 crc kubenswrapper[4763]: E1201 09:29:55.175800 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9 is running failed: container process not found" containerID="25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9" cmd=["grpc_health_probe","-addr=:50051"]
Dec 01 09:29:55 crc kubenswrapper[4763]: E1201 09:29:55.176854 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9 is running failed: container process not found" containerID="25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9" cmd=["grpc_health_probe","-addr=:50051"]
Dec 01 09:29:55 crc kubenswrapper[4763]: E1201 09:29:55.177338 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9 is running failed: container process not found" containerID="25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9" cmd=["grpc_health_probe","-addr=:50051"]
Dec 01 09:29:55 crc kubenswrapper[4763]: E1201 09:29:55.177396 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-wwwr5" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="registry-server"
Dec 01 09:29:55 crc kubenswrapper[4763]: I1201 09:29:55.544633 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk"
Dec 01 09:29:55 crc kubenswrapper[4763]: E1201 09:29:55.544878 4763 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 01 09:29:55 crc kubenswrapper[4763]: E1201 09:29:55.544972 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert podName:2aecf763-7a48-4c6f-a66c-ea391befd47a nodeName:}" failed. No retries permitted until 2025-12-01 09:30:11.54494841 +0000 UTC m=+928.813597228 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert") pod "infra-operator-controller-manager-57548d458d-9vjrk" (UID: "2aecf763-7a48-4c6f-a66c-ea391befd47a") : secret "infra-operator-webhook-server-cert" not found
Dec 01 09:29:55 crc kubenswrapper[4763]: I1201 09:29:55.950827 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj"
Dec 01 09:29:55 crc kubenswrapper[4763]: E1201 09:29:55.951026 4763 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 09:29:55 crc kubenswrapper[4763]: E1201 09:29:55.951295 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert podName:e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0 nodeName:}" failed. No retries permitted until 2025-12-01 09:30:11.951276318 +0000 UTC m=+929.219925076 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" (UID: "e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 01 09:29:56 crc kubenswrapper[4763]: I1201 09:29:56.862303 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:56 crc kubenswrapper[4763]: I1201 09:29:56.862420 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:56 crc kubenswrapper[4763]: I1201 09:29:56.868551 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-webhook-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:56 crc kubenswrapper[4763]: I1201 09:29:56.876654 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd971a72-ce63-45cb-9457-43fcea25f677-metrics-certs\") pod \"openstack-operator-controller-manager-6d555457c4-jcpzh\" (UID: \"dd971a72-ce63-45cb-9457-43fcea25f677\") " pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.102034 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-htdmn"
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.110773 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"
Dec 01 09:29:57 crc kubenswrapper[4763]: E1201 09:29:57.173983 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85"
Dec 01 09:29:57 crc kubenswrapper[4763]: E1201 09:29:57.174173 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v9s58,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-l547d_openstack-operators(ee716572-1b36-4216-84a6-ad3f4ac2b7f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.240417 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwwr5"
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.368683 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-utilities\") pod \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") "
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.369108 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-catalog-content\") pod \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") "
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.369391 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr9sj\" (UniqueName: \"kubernetes.io/projected/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-kube-api-access-mr9sj\") pod \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\" (UID: \"b6138a92-1fe1-48a0-8178-210dfe3ea1d0\") "
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.370006 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-utilities" (OuterVolumeSpecName: "utilities") pod "b6138a92-1fe1-48a0-8178-210dfe3ea1d0" (UID: "b6138a92-1fe1-48a0-8178-210dfe3ea1d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.370770 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.379432 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-kube-api-access-mr9sj" (OuterVolumeSpecName: "kube-api-access-mr9sj") pod "b6138a92-1fe1-48a0-8178-210dfe3ea1d0" (UID: "b6138a92-1fe1-48a0-8178-210dfe3ea1d0"). InnerVolumeSpecName "kube-api-access-mr9sj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.389010 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6138a92-1fe1-48a0-8178-210dfe3ea1d0" (UID: "b6138a92-1fe1-48a0-8178-210dfe3ea1d0"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.474466 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.474496 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr9sj\" (UniqueName: \"kubernetes.io/projected/b6138a92-1fe1-48a0-8178-210dfe3ea1d0-kube-api-access-mr9sj\") on node \"crc\" DevicePath \"\"" Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.524106 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wwwr5" event={"ID":"b6138a92-1fe1-48a0-8178-210dfe3ea1d0","Type":"ContainerDied","Data":"eea3c738dbafec928c7d22fb6af5311ae29d6d7a83631d6f95873471faed09c6"} Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.524160 4763 scope.go:117] "RemoveContainer" containerID="25f387c4f85981bfedcd20bac691323ba425d2451684e615d84e35c8913ea4f9" Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.524207 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wwwr5" Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.567175 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwwr5"] Dec 01 09:29:57 crc kubenswrapper[4763]: I1201 09:29:57.577990 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wwwr5"] Dec 01 09:29:59 crc kubenswrapper[4763]: I1201 09:29:59.004482 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" path="/var/lib/kubelet/pods/b6138a92-1fe1-48a0-8178-210dfe3ea1d0/volumes" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.164588 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m"] Dec 01 09:30:00 crc kubenswrapper[4763]: E1201 09:30:00.164969 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="extract-content" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.164987 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="extract-content" Dec 01 09:30:00 crc kubenswrapper[4763]: E1201 09:30:00.164997 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="extract-utilities" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.165007 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="extract-utilities" Dec 01 09:30:00 crc kubenswrapper[4763]: E1201 09:30:00.165017 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="registry-server" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.165026 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="registry-server" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.165191 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6138a92-1fe1-48a0-8178-210dfe3ea1d0" containerName="registry-server" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.165865 
4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.174172 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.175152 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.177750 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m"] Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.332183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqc5\" (UniqueName: \"kubernetes.io/projected/5aa0ec16-0f6b-4b6b-894c-31c949e95498-kube-api-access-wnqc5\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.332557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aa0ec16-0f6b-4b6b-894c-31c949e95498-config-volume\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.332625 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aa0ec16-0f6b-4b6b-894c-31c949e95498-secret-volume\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.433659 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqc5\" (UniqueName: \"kubernetes.io/projected/5aa0ec16-0f6b-4b6b-894c-31c949e95498-kube-api-access-wnqc5\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.433728 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aa0ec16-0f6b-4b6b-894c-31c949e95498-config-volume\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.433781 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aa0ec16-0f6b-4b6b-894c-31c949e95498-secret-volume\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.434754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5aa0ec16-0f6b-4b6b-894c-31c949e95498-config-volume\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.437978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aa0ec16-0f6b-4b6b-894c-31c949e95498-secret-volume\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.458541 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqc5\" (UniqueName: \"kubernetes.io/projected/5aa0ec16-0f6b-4b6b-894c-31c949e95498-kube-api-access-wnqc5\") pod \"collect-profiles-29409690-drv5m\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:00 crc kubenswrapper[4763]: I1201 09:30:00.484321 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:06 crc kubenswrapper[4763]: E1201 09:30:06.116857 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 01 09:30:06 crc kubenswrapper[4763]: E1201 09:30:06.117683 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l9sxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-nb5kc_openstack-operators(9eef10a1-bfcc-412c-9687-fee23d90d448): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:07 crc kubenswrapper[4763]: E1201 09:30:07.533680 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 01 09:30:07 crc kubenswrapper[4763]: E1201 09:30:07.534663 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nskj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-tk5xf_openstack-operators(13a2ac2b-0374-4da0-abbf-6aecbc3afbb8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:09 crc kubenswrapper[4763]: E1201 09:30:09.780747 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 01 09:30:09 crc kubenswrapper[4763]: E1201 09:30:09.781206 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rlbqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-4srbr_openstack-operators(c0ed7161-2907-48a6-894d-c6e3a1f47e0e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:11 crc kubenswrapper[4763]: E1201 09:30:11.328842 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 01 09:30:11 crc kubenswrapper[4763]: E1201 09:30:11.329036 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gcgkn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-9f5r9_openstack-operators(a0713966-4e10-4b8b-84bc-6560d1b1bf5a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:11 crc kubenswrapper[4763]: I1201 09:30:11.624013 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:30:11 crc kubenswrapper[4763]: I1201 09:30:11.629193 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2aecf763-7a48-4c6f-a66c-ea391befd47a-cert\") pod \"infra-operator-controller-manager-57548d458d-9vjrk\" (UID: \"2aecf763-7a48-4c6f-a66c-ea391befd47a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:30:11 crc kubenswrapper[4763]: I1201 09:30:11.657243 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hv47l" Dec 01 09:30:11 crc kubenswrapper[4763]: I1201 09:30:11.665493 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:30:11 crc kubenswrapper[4763]: E1201 09:30:11.797214 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:ecf7be921850bdc04697ed1b332bab39ad2a64e4e45c2a445c04f9bae6ac61b5" Dec 01 09:30:11 crc kubenswrapper[4763]: E1201 09:30:11.797393 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:ecf7be921850bdc04697ed1b332bab39ad2a64e4e45c2a445c04f9bae6ac61b5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bjdcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-6546668bfd-9x2q4_openstack-operators(4e44f450-61c5-4f49-b16b-8c9e0f060879): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:12 crc kubenswrapper[4763]: I1201 09:30:12.028172 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:30:12 crc kubenswrapper[4763]: I1201 09:30:12.037701 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj\" (UID: \"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:30:12 crc kubenswrapper[4763]: I1201 09:30:12.302469 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4ltnf" Dec 01 09:30:12 crc kubenswrapper[4763]: I1201 09:30:12.312957 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:30:13 crc kubenswrapper[4763]: E1201 09:30:13.855060 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 01 09:30:13 crc kubenswrapper[4763]: E1201 09:30:13.855629 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qkq5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
test-operator-controller-manager-5854674fcc-64k9k_openstack-operators(664dabb5-40f4-44c4-be9d-1870e153c877): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:16 crc kubenswrapper[4763]: E1201 09:30:16.475382 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385" Dec 01 09:30:16 crc kubenswrapper[4763]: E1201 09:30:16.475954 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qswkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-2skdl_openstack-operators(89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:16 crc kubenswrapper[4763]: E1201 09:30:16.979358 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 01 09:30:16 crc kubenswrapper[4763]: E1201 09:30:16.979571 4763 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ptgbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-jbvjc_openstack-operators(ddef7d32-1d4c-496d-be36-7ae7af64205a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:17 crc kubenswrapper[4763]: E1201 09:30:17.521191 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 01 09:30:17 crc kubenswrapper[4763]: E1201 09:30:17.521417 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwwm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-dbhs6_openstack-operators(1e77763b-639f-46aa-a798-e39251aa8636): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:21 crc kubenswrapper[4763]: E1201 09:30:21.158907 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 01 09:30:21 crc kubenswrapper[4763]: E1201 09:30:21.159364 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jcr66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-mqvfc_openstack-operators(20d8f0e3-8406-4e55-adbf-0681e090a82e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:23 crc kubenswrapper[4763]: E1201 09:30:23.404022 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621" Dec 01 09:30:23 crc kubenswrapper[4763]: E1201 09:30:23.404673 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vp5dt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-ctr5d_openstack-operators(3f494774-a168-4199-bfff-e73f64a669cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:27 crc kubenswrapper[4763]: E1201 09:30:27.479615 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 01 09:30:27 crc kubenswrapper[4763]: E1201 09:30:27.480107 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vr6bt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7hjvw_openshift-marketplace(c11aee5c-47cc-4313-9380-0146961c94b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" 
Dec 01 09:30:27 crc kubenswrapper[4763]: E1201 09:30:27.481307 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7hjvw" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" Dec 01 09:30:31 crc kubenswrapper[4763]: E1201 09:30:31.204845 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7hjvw" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" Dec 01 09:30:31 crc kubenswrapper[4763]: E1201 09:30:31.228064 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3" Dec 01 09:30:31 crc kubenswrapper[4763]: E1201 09:30:31.228252 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rl6k9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
keystone-operator-controller-manager-546d4bdf48-t74fr_openstack-operators(f5580ab2-73e2-4766-8e9c-f217fd4c079d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:32 crc kubenswrapper[4763]: I1201 09:30:32.940864 4763 scope.go:117] "RemoveContainer" containerID="1f302ab97c731d6ca027be40a4d510bad87a0b829dd8e525ce2e4b4091292212" Dec 01 09:30:32 crc kubenswrapper[4763]: E1201 09:30:32.965219 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 01 09:30:32 crc kubenswrapper[4763]: E1201 09:30:32.965433 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8bptt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-jxvhd_openstack-operators(0ee4b811-c59a-4120-bf78-53fe9e049d4b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:32 crc kubenswrapper[4763]: I1201 09:30:32.968861 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.880222 4763 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.881003 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dszlc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-s4rlp_openstack-operators(6b7d748f-a9e4-416a-8fd7-9fa46ca2060d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.882239 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" podUID="6b7d748f-a9e4-416a-8fd7-9fa46ca2060d" Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.964823 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.966972 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ptgbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-jbvjc_openstack-operators(ddef7d32-1d4c-496d-be36-7ae7af64205a): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.969752 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" podUID="ddef7d32-1d4c-496d-be36-7ae7af64205a" Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.989727 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.989915 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwwm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-dbhs6_openstack-operators(1e77763b-639f-46aa-a798-e39251aa8636): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 01 09:30:34 crc kubenswrapper[4763]: E1201 09:30:34.991157 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" podUID="1e77763b-639f-46aa-a798-e39251aa8636" Dec 01 09:30:35 crc kubenswrapper[4763]: I1201 09:30:35.237082 4763 scope.go:117] "RemoveContainer" containerID="762c98ec1c325d1e7669471771cc45c98d664048b14c70ba7dcd5e518d8fd19a" Dec 01 09:30:35 crc kubenswrapper[4763]: I1201 09:30:35.283236 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m"] Dec 01 09:30:35 crc kubenswrapper[4763]: I1201 09:30:35.542666 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh"] Dec 01 09:30:35 crc kubenswrapper[4763]: I1201 09:30:35.760436 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk"] Dec 01 09:30:35 crc kubenswrapper[4763]: I1201 09:30:35.780595 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj"] Dec 01 09:30:36 crc kubenswrapper[4763]: W1201 09:30:36.005527 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b6a6e4_ad2c_4f39_bf7f_777e5ddb62d0.slice/crio-bb0507b8fb11b4c34aabef35084f39688174f535226bb6df45948c39f4dec1c3 WatchSource:0}: Error finding container bb0507b8fb11b4c34aabef35084f39688174f535226bb6df45948c39f4dec1c3: Status 404 returned error can't find the container with id bb0507b8fb11b4c34aabef35084f39688174f535226bb6df45948c39f4dec1c3 Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 09:30:36.834382 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" event={"ID":"dd971a72-ce63-45cb-9457-43fcea25f677","Type":"ContainerStarted","Data":"69743b131a275b641794553a73521da363a7c3f956301607c89e04813486c03c"} Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 
09:30:36.837625 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" event={"ID":"eae2e950-9f81-49cc-926e-380b81a0f0e7","Type":"ContainerStarted","Data":"679970406e14aa0224080cf05848b4df97e5fe6e12209846c8f93962616a5af9"} Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 09:30:36.840151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" event={"ID":"1c980f4b-c55c-4a2e-9461-9f89ec0165c3","Type":"ContainerStarted","Data":"4f60b326f2b7cb7a2a496df784fe5bd4d68daba0294ff00c59efc2e8b1b6e5f5"} Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 09:30:36.841939 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" event={"ID":"c274cd4c-0c77-485c-8d8f-116a2f7b013b","Type":"ContainerStarted","Data":"e5b434118d842f092ebcfa266a9199fbc263ff008dd7ecda7ece49e2dbdcff59"} Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 09:30:36.843679 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" event={"ID":"5aa0ec16-0f6b-4b6b-894c-31c949e95498","Type":"ContainerStarted","Data":"bd74182a817d99642e0474e04aa41486f969dcb34fefaf4dd0d316b103148833"} Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 09:30:36.846747 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" event={"ID":"1208b653-3551-4266-99b9-e83fb86b4771","Type":"ContainerStarted","Data":"38e96ea264dd4cf9bd1d0e6243d10d1b0de6fb1ddfdaa5f638754dd7a530c7d2"} Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 09:30:36.850530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" event={"ID":"2aecf763-7a48-4c6f-a66c-ea391befd47a","Type":"ContainerStarted","Data":"e6730949602604a9a4f84cb4b20e833d02b31f90a75bba09543b7568edfd7d7b"} Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 09:30:36.852520 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" event={"ID":"5325eff2-4650-499e-9cad-f486bae74fce","Type":"ContainerStarted","Data":"3cec3401c008743173e30423bb44aa0430c19082f6a9d8d64b239df3cf4a4162"} Dec 01 09:30:36 crc kubenswrapper[4763]: I1201 09:30:36.854335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" event={"ID":"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0","Type":"ContainerStarted","Data":"bb0507b8fb11b4c34aabef35084f39688174f535226bb6df45948c39f4dec1c3"} Dec 01 09:30:38 crc kubenswrapper[4763]: I1201 09:30:38.877385 4763 generic.go:334] "Generic (PLEG): container finished" podID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerID="bef500e45be5aad41eb62c6bf57a86c434a3f5ea2d784bad5384e9821afe9292" exitCode=0 Dec 01 09:30:38 crc kubenswrapper[4763]: I1201 09:30:38.877433 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmb2k" event={"ID":"96e0bbed-711d-48ca-bf8c-fa678c91c3de","Type":"ContainerDied","Data":"bef500e45be5aad41eb62c6bf57a86c434a3f5ea2d784bad5384e9821afe9292"} Dec 01 09:30:38 crc kubenswrapper[4763]: I1201 09:30:38.881529 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" 
event={"ID":"5aa0ec16-0f6b-4b6b-894c-31c949e95498","Type":"ContainerStarted","Data":"c46e6b46de1606f5513d6221c403d4c8ef18650f577951ebee6572d3046aa5a0"} Dec 01 09:30:38 crc kubenswrapper[4763]: I1201 09:30:38.916417 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" podStartSLOduration=38.916397701 podStartE2EDuration="38.916397701s" podCreationTimestamp="2025-12-01 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:38.910281041 +0000 UTC m=+956.178929809" watchObservedRunningTime="2025-12-01 09:30:38.916397701 +0000 UTC m=+956.185046469" Dec 01 09:30:39 crc kubenswrapper[4763]: I1201 09:30:39.910750 4763 generic.go:334] "Generic (PLEG): container finished" podID="5aa0ec16-0f6b-4b6b-894c-31c949e95498" containerID="c46e6b46de1606f5513d6221c403d4c8ef18650f577951ebee6572d3046aa5a0" exitCode=0 Dec 01 09:30:39 crc kubenswrapper[4763]: I1201 09:30:39.910929 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" event={"ID":"5aa0ec16-0f6b-4b6b-894c-31c949e95498","Type":"ContainerDied","Data":"c46e6b46de1606f5513d6221c403d4c8ef18650f577951ebee6572d3046aa5a0"} Dec 01 09:30:40 crc kubenswrapper[4763]: E1201 09:30:40.147749 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" podUID="4e44f450-61c5-4f49-b16b-8c9e0f060879" Dec 01 09:30:40 crc kubenswrapper[4763]: E1201 09:30:40.342685 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" podUID="89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0" Dec 01 09:30:40 crc kubenswrapper[4763]: E1201 09:30:40.499885 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" podUID="f5580ab2-73e2-4766-8e9c-f217fd4c079d" Dec 01 09:30:40 crc kubenswrapper[4763]: E1201 09:30:40.756558 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" podUID="13a2ac2b-0374-4da0-abbf-6aecbc3afbb8" Dec 01 09:30:40 crc kubenswrapper[4763]: E1201 09:30:40.769259 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" podUID="664dabb5-40f4-44c4-be9d-1870e153c877" Dec 01 09:30:40 crc kubenswrapper[4763]: E1201 09:30:40.932998 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 01 09:30:40 crc 
kubenswrapper[4763]: E1201 09:30:40.933169 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l9sxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-nb5kc_openstack-operators(9eef10a1-bfcc-412c-9687-fee23d90d448): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:30:40 crc kubenswrapper[4763]: E1201 09:30:40.936607 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" podUID="9eef10a1-bfcc-412c-9687-fee23d90d448" Dec 01 09:30:40 crc kubenswrapper[4763]: I1201 09:30:40.948705 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" event={"ID":"5325eff2-4650-499e-9cad-f486bae74fce","Type":"ContainerStarted","Data":"4409457c2ee249b9fff89342c6c21021c290c4588bac0dcaccb7e2829f856f29"} Dec 01 09:30:40 crc kubenswrapper[4763]: I1201 09:30:40.949832 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" Dec 01 09:30:40 crc kubenswrapper[4763]: I1201 09:30:40.957127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" event={"ID":"f5580ab2-73e2-4766-8e9c-f217fd4c079d","Type":"ContainerStarted","Data":"3bf6261c678dcd3e0346a85fc205fce9156fa5a8b00105eaacf68edc971fbf17"} Dec 01 09:30:40 crc kubenswrapper[4763]: E1201 09:30:40.963587 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:986861e5a0a9954f63581d9d55a30f8057883cefea489415d76257774526eea3\\\"\"" 
pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" podUID="f5580ab2-73e2-4766-8e9c-f217fd4c079d" Dec 01 09:30:40 crc kubenswrapper[4763]: I1201 09:30:40.970947 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" event={"ID":"1c980f4b-c55c-4a2e-9461-9f89ec0165c3","Type":"ContainerStarted","Data":"a09c497c3a82b519fcc6aa6d99594df2516c76e75ae077193a67765341c0d408"} Dec 01 09:30:40 crc kubenswrapper[4763]: I1201 09:30:40.971132 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" Dec 01 09:30:40 crc kubenswrapper[4763]: I1201 09:30:40.978674 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" podStartSLOduration=3.8027048839999997 podStartE2EDuration="1m1.978653155s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:41.927008286 +0000 UTC m=+899.195657054" lastFinishedPulling="2025-12-01 09:30:40.102956557 +0000 UTC m=+957.371605325" observedRunningTime="2025-12-01 09:30:40.971087695 +0000 UTC m=+958.239736463" watchObservedRunningTime="2025-12-01 09:30:40.978653155 +0000 UTC m=+958.247301923" Dec 01 09:30:40 crc kubenswrapper[4763]: I1201 09:30:40.986849 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" event={"ID":"dd971a72-ce63-45cb-9457-43fcea25f677","Type":"ContainerStarted","Data":"2cd3ee58c8eb7e4eb5ae5c4cf3e826b50474393bded3fe080ff47df6636decf9"} Dec 01 09:30:40 crc kubenswrapper[4763]: I1201 09:30:40.987117 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.001631 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" podStartSLOduration=3.021029496 podStartE2EDuration="1m2.001618599s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:41.615062798 +0000 UTC m=+898.883711566" lastFinishedPulling="2025-12-01 09:30:40.595651901 +0000 UTC m=+957.864300669" observedRunningTime="2025-12-01 09:30:40.998556574 +0000 UTC m=+958.267205342" watchObservedRunningTime="2025-12-01 09:30:41.001618599 +0000 UTC m=+958.270267367" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.015811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" event={"ID":"4e44f450-61c5-4f49-b16b-8c9e0f060879","Type":"ContainerStarted","Data":"8e22c1fec4b67f4c853e3e3fd8ebde6087c6e88392dd71e1da2fa56080520dd6"} Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.022722 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" event={"ID":"664dabb5-40f4-44c4-be9d-1870e153c877","Type":"ContainerStarted","Data":"0300213e58bb7918eedf3f18fc74ec6307ca47222b19391e9f4be6a0c84a7f1f"} Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.060926 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" 
event={"ID":"eae2e950-9f81-49cc-926e-380b81a0f0e7","Type":"ContainerStarted","Data":"4a9f6888ba33930b7e66d998455f26c76b25f81ebf225988f4001f082f069077"} Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.060984 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.065674 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.067026 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" event={"ID":"13a2ac2b-0374-4da0-abbf-6aecbc3afbb8","Type":"ContainerStarted","Data":"f17cae76e4fb1f7c977485d298e3b42faa4d3f86ac97612e57b7d3fbf7771b45"} Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.079022 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" event={"ID":"c274cd4c-0c77-485c-8d8f-116a2f7b013b","Type":"ContainerStarted","Data":"ea1aa168d38d3da121894ada053a9793598ed32b13354f4c914c3461afbfe65c"} Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.080220 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.088016 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.096120 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" event={"ID":"89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0","Type":"ContainerStarted","Data":"6ee0d4776acb726c3b9870a92739c0c7afc10b5988ef235e43357f7475e2979a"} Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.143104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" event={"ID":"1208b653-3551-4266-99b9-e83fb86b4771","Type":"ContainerStarted","Data":"1997634f662b9deb11aed3c0e51a1e454462aad7de6f429b63a1588fa959d883"} Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.143325 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.147046 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.148763 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" podStartSLOduration=61.148749954 podStartE2EDuration="1m1.148749954s" podCreationTimestamp="2025-12-01 09:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:30:41.104200953 +0000 UTC m=+958.372849721" watchObservedRunningTime="2025-12-01 09:30:41.148749954 +0000 UTC m=+958.417398722" Dec 01 09:30:41 crc kubenswrapper[4763]: E1201 09:30:41.167350 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" podUID="c0ed7161-2907-48a6-894d-c6e3a1f47e0e" Dec 01 09:30:41 crc kubenswrapper[4763]: E1201 09:30:41.198801 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" podUID="0ee4b811-c59a-4120-bf78-53fe9e049d4b" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.199898 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-qh7dr" podStartSLOduration=3.556525502 podStartE2EDuration="1m2.199885878s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:41.614864502 +0000 UTC m=+898.883513270" lastFinishedPulling="2025-12-01 09:30:40.258224888 +0000 UTC m=+957.526873646" observedRunningTime="2025-12-01 09:30:41.199107056 +0000 UTC m=+958.467755824" watchObservedRunningTime="2025-12-01 09:30:41.199885878 +0000 UTC m=+958.468534646" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.242223 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-rjzdx" podStartSLOduration=4.011919527 podStartE2EDuration="1m2.242202077s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.049261356 +0000 UTC m=+899.317910124" lastFinishedPulling="2025-12-01 09:30:40.279543906 +0000 UTC m=+957.548192674" observedRunningTime="2025-12-01 09:30:41.239812351 +0000 UTC m=+958.508461129" watchObservedRunningTime="2025-12-01 09:30:41.242202077 +0000 UTC m=+958.510850855" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.345895 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-k2nzk" podStartSLOduration=4.960734312 podStartE2EDuration="1m2.345879292s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.574349393 +0000 UTC m=+899.842998161" lastFinishedPulling="2025-12-01 09:30:39.959494373 +0000 UTC m=+957.228143141" observedRunningTime="2025-12-01 09:30:41.342759876 +0000 UTC m=+958.611408644" watchObservedRunningTime="2025-12-01 09:30:41.345879292 +0000 UTC m=+958.614528060" Dec 01 09:30:41 crc kubenswrapper[4763]: E1201 09:30:41.583919 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" podUID="ee716572-1b36-4216-84a6-ad3f4ac2b7f6" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.864154 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.963001 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aa0ec16-0f6b-4b6b-894c-31c949e95498-config-volume\") pod \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.963049 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aa0ec16-0f6b-4b6b-894c-31c949e95498-secret-volume\") pod \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.963117 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnqc5\" (UniqueName: \"kubernetes.io/projected/5aa0ec16-0f6b-4b6b-894c-31c949e95498-kube-api-access-wnqc5\") pod \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\" (UID: \"5aa0ec16-0f6b-4b6b-894c-31c949e95498\") " Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.965613 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aa0ec16-0f6b-4b6b-894c-31c949e95498-config-volume" (OuterVolumeSpecName: "config-volume") pod "5aa0ec16-0f6b-4b6b-894c-31c949e95498" (UID: "5aa0ec16-0f6b-4b6b-894c-31c949e95498"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.980977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aa0ec16-0f6b-4b6b-894c-31c949e95498-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5aa0ec16-0f6b-4b6b-894c-31c949e95498" (UID: "5aa0ec16-0f6b-4b6b-894c-31c949e95498"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:30:41 crc kubenswrapper[4763]: I1201 09:30:41.988743 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa0ec16-0f6b-4b6b-894c-31c949e95498-kube-api-access-wnqc5" (OuterVolumeSpecName: "kube-api-access-wnqc5") pod "5aa0ec16-0f6b-4b6b-894c-31c949e95498" (UID: "5aa0ec16-0f6b-4b6b-894c-31c949e95498"). InnerVolumeSpecName "kube-api-access-wnqc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:30:42 crc kubenswrapper[4763]: E1201 09:30:42.009888 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" podUID="20d8f0e3-8406-4e55-adbf-0681e090a82e" Dec 01 09:30:42 crc kubenswrapper[4763]: E1201 09:30:42.040182 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" podUID="a0713966-4e10-4b8b-84bc-6560d1b1bf5a" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.065188 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aa0ec16-0f6b-4b6b-894c-31c949e95498-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.065223 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5aa0ec16-0f6b-4b6b-894c-31c949e95498-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.065233 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnqc5\" (UniqueName: \"kubernetes.io/projected/5aa0ec16-0f6b-4b6b-894c-31c949e95498-kube-api-access-wnqc5\") on node \"crc\" DevicePath \"\"" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.196976 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" event={"ID":"0ee4b811-c59a-4120-bf78-53fe9e049d4b","Type":"ContainerStarted","Data":"6d21341287d478905626b2fd6ae86accf7a5a56c2bc94334db508213dd8a3b46"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.215958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" event={"ID":"ddef7d32-1d4c-496d-be36-7ae7af64205a","Type":"ContainerStarted","Data":"93b4b03bddf2b10b818fac0f6903cae067903bf1b0524c6c4138c8a53f0bc5d4"} Dec 01 09:30:42 crc kubenswrapper[4763]: E1201 09:30:42.222236 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" podUID="0ee4b811-c59a-4120-bf78-53fe9e049d4b" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.229378 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" event={"ID":"1e77763b-639f-46aa-a798-e39251aa8636","Type":"ContainerStarted","Data":"25fcf2b31d061457717ff7e3ada2e60b5374025d8b9a54c3e83325ee31a8d989"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.229438 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" event={"ID":"1e77763b-639f-46aa-a798-e39251aa8636","Type":"ContainerStarted","Data":"a2a40252c3db9951a8d14f1191711a3a38a3d2eed42a2fd3225925dd4fabc8d1"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 
09:30:42.230418 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.268189 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" event={"ID":"5aa0ec16-0f6b-4b6b-894c-31c949e95498","Type":"ContainerDied","Data":"bd74182a817d99642e0474e04aa41486f969dcb34fefaf4dd0d316b103148833"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.268562 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd74182a817d99642e0474e04aa41486f969dcb34fefaf4dd0d316b103148833" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.268660 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.284308 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" podStartSLOduration=4.967405791 podStartE2EDuration="1m3.284287892s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.556150925 +0000 UTC m=+899.824799693" lastFinishedPulling="2025-12-01 09:30:40.873033026 +0000 UTC m=+958.141681794" observedRunningTime="2025-12-01 09:30:42.282012559 +0000 UTC m=+959.550661327" watchObservedRunningTime="2025-12-01 09:30:42.284287892 +0000 UTC m=+959.552936660" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.306097 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" event={"ID":"c0ed7161-2907-48a6-894d-c6e3a1f47e0e","Type":"ContainerStarted","Data":"2f97f418ac894641143c8312bc72027fefa6ebcd4e30cb84845490acda5b6b23"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.339986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" event={"ID":"20d8f0e3-8406-4e55-adbf-0681e090a82e","Type":"ContainerStarted","Data":"45dc07959b80d00aa68a3808232937cb2a51705e7fc27bcf8ea42103f1d35c12"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.367750 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" event={"ID":"a0713966-4e10-4b8b-84bc-6560d1b1bf5a","Type":"ContainerStarted","Data":"618dd344fc237ed3f9c5e04f63c360cf5c42196ca3747780807c308dc31b2902"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.419813 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmb2k" event={"ID":"96e0bbed-711d-48ca-bf8c-fa678c91c3de","Type":"ContainerStarted","Data":"8bb3123aa991b4fc0ed2520266e9427ad66226e2b52d83218096a15dcd6e2738"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.439806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" event={"ID":"ee716572-1b36-4216-84a6-ad3f4ac2b7f6","Type":"ContainerStarted","Data":"7d854c7daea132e25fc6c50b7feaa116cf1312b8c08cd4d602d63751de2e310d"} Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.445072 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-xx8m2" Dec 01 09:30:42 
crc kubenswrapper[4763]: I1201 09:30:42.449488 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-nh5mz" Dec 01 09:30:42 crc kubenswrapper[4763]: I1201 09:30:42.476232 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dmb2k" podStartSLOduration=8.404059314 podStartE2EDuration="1m2.476215615s" podCreationTimestamp="2025-12-01 09:29:40 +0000 UTC" firstStartedPulling="2025-12-01 09:29:46.996014607 +0000 UTC m=+904.264663395" lastFinishedPulling="2025-12-01 09:30:41.068170928 +0000 UTC m=+958.336819696" observedRunningTime="2025-12-01 09:30:42.467878424 +0000 UTC m=+959.736527192" watchObservedRunningTime="2025-12-01 09:30:42.476215615 +0000 UTC m=+959.744864383" Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.505885 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" event={"ID":"664dabb5-40f4-44c4-be9d-1870e153c877","Type":"ContainerStarted","Data":"5e54ee34c79ed75f36163fee4e22b27c2bf589bc1757a72fdd1ba4584e163d03"} Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.506237 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.520097 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" event={"ID":"89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0","Type":"ContainerStarted","Data":"773101ed4e3103b4019da37e0b54b424e9ef3469fb2bb1a2fb59f37a6a11baaa"} Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.520756 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.541173 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" event={"ID":"13a2ac2b-0374-4da0-abbf-6aecbc3afbb8","Type":"ContainerStarted","Data":"99982a04a4e7858dde9bc4d59479d43381dc1532c50fa66fae89133106f0beed"} Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.542012 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.555227 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" podStartSLOduration=4.97796986 podStartE2EDuration="1m3.555212959s" podCreationTimestamp="2025-12-01 09:29:40 +0000 UTC" firstStartedPulling="2025-12-01 09:29:43.272583658 +0000 UTC m=+900.541232426" lastFinishedPulling="2025-12-01 09:30:41.849826747 +0000 UTC m=+959.118475525" observedRunningTime="2025-12-01 09:30:43.553829902 +0000 UTC m=+960.822478670" watchObservedRunningTime="2025-12-01 09:30:43.555212959 +0000 UTC m=+960.823861727" Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.570567 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" event={"ID":"ddef7d32-1d4c-496d-be36-7ae7af64205a","Type":"ContainerStarted","Data":"2f67a7753f1206f1d863e1128198aef2053a6b374afc6d0b6f64e86b8f88b104"} Dec 01 09:30:43 crc 
kubenswrapper[4763]: I1201 09:30:43.625400 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" podStartSLOduration=5.542881823 podStartE2EDuration="1m4.625379178s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:43.070349835 +0000 UTC m=+900.338998603" lastFinishedPulling="2025-12-01 09:30:42.15284719 +0000 UTC m=+959.421495958" observedRunningTime="2025-12-01 09:30:43.592236173 +0000 UTC m=+960.860884941" watchObservedRunningTime="2025-12-01 09:30:43.625379178 +0000 UTC m=+960.894027946" Dec 01 09:30:43 crc kubenswrapper[4763]: I1201 09:30:43.643753 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" podStartSLOduration=4.7746526209999995 podStartE2EDuration="1m3.643734166s" podCreationTimestamp="2025-12-01 09:29:40 +0000 UTC" firstStartedPulling="2025-12-01 09:29:43.270506678 +0000 UTC m=+900.539155486" lastFinishedPulling="2025-12-01 09:30:42.139588263 +0000 UTC m=+959.408237031" observedRunningTime="2025-12-01 09:30:43.637158785 +0000 UTC m=+960.905807553" watchObservedRunningTime="2025-12-01 09:30:43.643734166 +0000 UTC m=+960.912382934" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.585233 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" event={"ID":"4e44f450-61c5-4f49-b16b-8c9e0f060879","Type":"ContainerStarted","Data":"80bfae38550543af8d5aef400a70e1abf81af057127246d8b13beac7f4f472a3"} Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.586526 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.593356 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" event={"ID":"ee716572-1b36-4216-84a6-ad3f4ac2b7f6","Type":"ContainerStarted","Data":"1c4ca2b01baafb27bb0e32632de1e4a650840065882669feb40b73fcca32bf5a"} Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.594007 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.596132 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" event={"ID":"c0ed7161-2907-48a6-894d-c6e3a1f47e0e","Type":"ContainerStarted","Data":"5af9c8b4597e8486b7a81cfcd41e30d7c3c652344a99a5f89dd395c824195af7"} Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.596671 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.610054 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" event={"ID":"20d8f0e3-8406-4e55-adbf-0681e090a82e","Type":"ContainerStarted","Data":"d560290c868bebde0f67bec50b4eece511e092f928727d2251ad8969a42e2145"} Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.610776 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" Dec 01 
09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.615924 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" podStartSLOduration=6.96268001 podStartE2EDuration="1m5.615908359s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.567471113 +0000 UTC m=+899.836119881" lastFinishedPulling="2025-12-01 09:30:41.220699462 +0000 UTC m=+958.489348230" observedRunningTime="2025-12-01 09:30:43.723773508 +0000 UTC m=+960.992422276" watchObservedRunningTime="2025-12-01 09:30:44.615908359 +0000 UTC m=+961.884557127" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.632477 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" event={"ID":"a0713966-4e10-4b8b-84bc-6560d1b1bf5a","Type":"ContainerStarted","Data":"c73ec22327e8ddd95ea0cea726dc8d91dbedc0d63b39f195fc5aa98a4a44a14b"} Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.633204 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.640358 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" event={"ID":"f5580ab2-73e2-4766-8e9c-f217fd4c079d","Type":"ContainerStarted","Data":"bbe73c13e6a982555ffaaa1b679cf165a9883eaebb8e15eb53a4e29625a57f8e"} Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.640839 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.641658 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.682874 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" podStartSLOduration=5.222575609 podStartE2EDuration="1m5.682855529s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.639750422 +0000 UTC m=+899.908399190" lastFinishedPulling="2025-12-01 09:30:43.100030342 +0000 UTC m=+960.368679110" observedRunningTime="2025-12-01 09:30:44.622037358 +0000 UTC m=+961.890686126" watchObservedRunningTime="2025-12-01 09:30:44.682855529 +0000 UTC m=+961.951504297" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.683057 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" podStartSLOduration=3.450198285 podStartE2EDuration="1m5.683052395s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:41.614494092 +0000 UTC m=+898.883142860" lastFinishedPulling="2025-12-01 09:30:43.847348202 +0000 UTC m=+961.115996970" observedRunningTime="2025-12-01 09:30:44.670014144 +0000 UTC m=+961.938662912" watchObservedRunningTime="2025-12-01 09:30:44.683052395 +0000 UTC m=+961.951701163" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.715438 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" podStartSLOduration=5.084945636 
podStartE2EDuration="1m5.715414619s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:43.215871751 +0000 UTC m=+900.484520519" lastFinishedPulling="2025-12-01 09:30:43.846340734 +0000 UTC m=+961.114989502" observedRunningTime="2025-12-01 09:30:44.70896996 +0000 UTC m=+961.977618738" watchObservedRunningTime="2025-12-01 09:30:44.715414619 +0000 UTC m=+961.984063397" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.733534 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" podStartSLOduration=4.992905362 podStartE2EDuration="1m5.733517629s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.646513418 +0000 UTC m=+899.915162186" lastFinishedPulling="2025-12-01 09:30:43.387125685 +0000 UTC m=+960.655774453" observedRunningTime="2025-12-01 09:30:44.729644562 +0000 UTC m=+961.998293330" watchObservedRunningTime="2025-12-01 09:30:44.733517629 +0000 UTC m=+962.002166397" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.769734 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" podStartSLOduration=5.028871024 podStartE2EDuration="1m5.769715509s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.645046316 +0000 UTC m=+899.913695084" lastFinishedPulling="2025-12-01 09:30:43.385890801 +0000 UTC m=+960.654539569" observedRunningTime="2025-12-01 09:30:44.762876671 +0000 UTC m=+962.031525449" watchObservedRunningTime="2025-12-01 09:30:44.769715509 +0000 UTC m=+962.038364277" Dec 01 09:30:44 crc kubenswrapper[4763]: I1201 09:30:44.809351 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" podStartSLOduration=5.3360914489999995 podStartE2EDuration="1m5.809334694s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.628016991 +0000 UTC m=+899.896665749" lastFinishedPulling="2025-12-01 09:30:43.101260226 +0000 UTC m=+960.369908994" observedRunningTime="2025-12-01 09:30:44.805374995 +0000 UTC m=+962.074023763" watchObservedRunningTime="2025-12-01 09:30:44.809334694 +0000 UTC m=+962.077983462" Dec 01 09:30:45 crc kubenswrapper[4763]: E1201 09:30:45.297100 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" podUID="3f494774-a168-4199-bfff-e73f64a669cf" Dec 01 09:30:45 crc kubenswrapper[4763]: I1201 09:30:45.650619 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" event={"ID":"0ee4b811-c59a-4120-bf78-53fe9e049d4b","Type":"ContainerStarted","Data":"a00f5733749042c2003b84083fc002548a75c731480420e4b6cdea30a207101a"} Dec 01 09:30:45 crc kubenswrapper[4763]: I1201 09:30:45.651621 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" Dec 01 09:30:45 crc kubenswrapper[4763]: I1201 09:30:45.653518 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" 
event={"ID":"3f494774-a168-4199-bfff-e73f64a669cf","Type":"ContainerStarted","Data":"3325be732c208d91477dc7b0e8c4fe6b6a03b46bebb11a48b6b2d5382d014b9f"} Dec 01 09:30:45 crc kubenswrapper[4763]: I1201 09:30:45.733606 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" podStartSLOduration=4.812320748 podStartE2EDuration="1m6.733584883s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.9702882 +0000 UTC m=+900.238936968" lastFinishedPulling="2025-12-01 09:30:44.891552335 +0000 UTC m=+962.160201103" observedRunningTime="2025-12-01 09:30:45.6918708 +0000 UTC m=+962.960519568" watchObservedRunningTime="2025-12-01 09:30:45.733584883 +0000 UTC m=+963.002233651" Dec 01 09:30:45 crc kubenswrapper[4763]: E1201 09:30:45.995782 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" podUID="6b7d748f-a9e4-416a-8fd7-9fa46ca2060d" Dec 01 09:30:47 crc kubenswrapper[4763]: I1201 09:30:47.119404 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6d555457c4-jcpzh" Dec 01 09:30:49 crc kubenswrapper[4763]: I1201 09:30:49.502584 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-l547d" Dec 01 09:30:49 crc kubenswrapper[4763]: I1201 09:30:49.833393 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-dbhs6" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.074273 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-t74fr" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.150082 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-9x2q4" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.343614 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-4srbr" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.441118 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-jbvjc" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.555185 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-9f5r9" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.674015 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-jxvhd" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.704042 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hjvw" event={"ID":"c11aee5c-47cc-4313-9380-0146961c94b8","Type":"ContainerStarted","Data":"119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2"} 
Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.706049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" event={"ID":"2aecf763-7a48-4c6f-a66c-ea391befd47a","Type":"ContainerStarted","Data":"665225a8025d94f91dac85f71a93ab0a167c82556ba0016536acb0cb55559cef"} Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.706097 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" event={"ID":"2aecf763-7a48-4c6f-a66c-ea391befd47a","Type":"ContainerStarted","Data":"b4e6a44de6b22f1c3aaba6422c4b5770563fadd3089cfa57cbcae2cc04afcc19"} Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.706207 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.707806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" event={"ID":"3f494774-a168-4199-bfff-e73f64a669cf","Type":"ContainerStarted","Data":"c746aa43ce7ebc106ec36df7e49ef240dbd5e3ed304178d89e8b8bb21169dd1c"} Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.708027 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.709797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" event={"ID":"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0","Type":"ContainerStarted","Data":"d7a865759b33877d943ba4ec5b964dc0048214594fa4e273645dd4cf55859b31"} Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.709828 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" event={"ID":"e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0","Type":"ContainerStarted","Data":"36b455639e040be577800f3fe32d3e81b71608bb98cc813ef56a7d3ba26ab9e9"} Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.709924 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.710824 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" event={"ID":"9eef10a1-bfcc-412c-9687-fee23d90d448","Type":"ContainerStarted","Data":"d8215b18403d1055a628bf425031856accdb6305418a37c62cf94bc84f6f8690"} Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.766640 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" podStartSLOduration=4.465398607 podStartE2EDuration="1m10.766619176s" podCreationTimestamp="2025-12-01 09:29:40 +0000 UTC" firstStartedPulling="2025-12-01 09:29:43.336412291 +0000 UTC m=+900.605061059" lastFinishedPulling="2025-12-01 09:30:49.63763286 +0000 UTC m=+966.906281628" observedRunningTime="2025-12-01 09:30:50.761957087 +0000 UTC m=+968.030605855" watchObservedRunningTime="2025-12-01 09:30:50.766619176 +0000 UTC m=+968.035267944" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.822284 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" podStartSLOduration=58.187869718 podStartE2EDuration="1m11.822267704s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:30:35.98934688 +0000 UTC m=+953.257995648" lastFinishedPulling="2025-12-01 09:30:49.623744856 +0000 UTC m=+966.892393634" observedRunningTime="2025-12-01 09:30:50.816527325 +0000 UTC m=+968.085176093" watchObservedRunningTime="2025-12-01 09:30:50.822267704 +0000 UTC m=+968.090916472" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.882659 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" podStartSLOduration=58.242910458 podStartE2EDuration="1m11.882644182s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:30:36.01071347 +0000 UTC m=+953.279362238" lastFinishedPulling="2025-12-01 09:30:49.650447194 +0000 UTC m=+966.919095962" observedRunningTime="2025-12-01 09:30:50.877445808 +0000 UTC m=+968.146094576" watchObservedRunningTime="2025-12-01 09:30:50.882644182 +0000 UTC m=+968.151292950" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.891426 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-tk5xf" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.915122 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-mqvfc" Dec 01 09:30:50 crc kubenswrapper[4763]: I1201 09:30:50.986632 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-2skdl" Dec 01 09:30:51 crc kubenswrapper[4763]: I1201 09:30:51.118288 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-64k9k" Dec 01 09:30:51 crc kubenswrapper[4763]: I1201 09:30:51.227761 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:30:51 crc kubenswrapper[4763]: I1201 09:30:51.227844 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:30:51 crc kubenswrapper[4763]: I1201 09:30:51.719770 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" event={"ID":"9eef10a1-bfcc-412c-9687-fee23d90d448","Type":"ContainerStarted","Data":"c154f104a80d867b870445d32158e50b4b08c4df6bdc0a39bd9ada9026e4a54f"} Dec 01 09:30:51 crc kubenswrapper[4763]: I1201 09:30:51.720028 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" Dec 01 09:30:51 crc kubenswrapper[4763]: I1201 09:30:51.721606 4763 generic.go:334] "Generic (PLEG): container finished" podID="c11aee5c-47cc-4313-9380-0146961c94b8" containerID="119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2" exitCode=0 Dec 01 09:30:51 crc kubenswrapper[4763]: I1201 09:30:51.721685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hjvw" 
event={"ID":"c11aee5c-47cc-4313-9380-0146961c94b8","Type":"ContainerDied","Data":"119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2"} Dec 01 09:30:51 crc kubenswrapper[4763]: I1201 09:30:51.758574 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" podStartSLOduration=5.34452684 podStartE2EDuration="1m12.758544625s" podCreationTimestamp="2025-12-01 09:29:39 +0000 UTC" firstStartedPulling="2025-12-01 09:29:42.412673439 +0000 UTC m=+899.681322217" lastFinishedPulling="2025-12-01 09:30:49.826691234 +0000 UTC m=+967.095340002" observedRunningTime="2025-12-01 09:30:51.745180616 +0000 UTC m=+969.013829404" watchObservedRunningTime="2025-12-01 09:30:51.758544625 +0000 UTC m=+969.027193393" Dec 01 09:30:52 crc kubenswrapper[4763]: I1201 09:30:52.297752 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-dmb2k" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="registry-server" probeResult="failure" output=< Dec 01 09:30:52 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 09:30:52 crc kubenswrapper[4763]: > Dec 01 09:30:52 crc kubenswrapper[4763]: I1201 09:30:52.733486 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hjvw" event={"ID":"c11aee5c-47cc-4313-9380-0146961c94b8","Type":"ContainerStarted","Data":"630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf"} Dec 01 09:30:52 crc kubenswrapper[4763]: I1201 09:30:52.749532 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7hjvw" podStartSLOduration=5.485051129 podStartE2EDuration="1m10.749482676s" podCreationTimestamp="2025-12-01 09:29:42 +0000 UTC" firstStartedPulling="2025-12-01 09:29:46.995616326 +0000 UTC m=+904.264265104" lastFinishedPulling="2025-12-01 09:30:52.260047843 +0000 UTC m=+969.528696651" observedRunningTime="2025-12-01 09:30:52.748091628 +0000 UTC m=+970.016740396" watchObservedRunningTime="2025-12-01 09:30:52.749482676 +0000 UTC m=+970.018131444" Dec 01 09:30:53 crc kubenswrapper[4763]: I1201 09:30:53.169014 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:30:53 crc kubenswrapper[4763]: I1201 09:30:53.169071 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:30:54 crc kubenswrapper[4763]: I1201 09:30:54.214430 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7hjvw" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="registry-server" probeResult="failure" output=< Dec 01 09:30:54 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 09:30:54 crc kubenswrapper[4763]: > Dec 01 09:30:59 crc kubenswrapper[4763]: I1201 09:30:59.799302 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" event={"ID":"6b7d748f-a9e4-416a-8fd7-9fa46ca2060d","Type":"ContainerStarted","Data":"bb2b6939cec8b7b3097ae67545e5758570ae2f916896f1eb16fc0d622c4cbcf0"} Dec 01 09:30:59 crc kubenswrapper[4763]: I1201 09:30:59.815897 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-s4rlp" 
podStartSLOduration=4.425871418 podStartE2EDuration="1m19.815881937s" podCreationTimestamp="2025-12-01 09:29:40 +0000 UTC" firstStartedPulling="2025-12-01 09:29:43.351583982 +0000 UTC m=+900.620232750" lastFinishedPulling="2025-12-01 09:30:58.741594501 +0000 UTC m=+976.010243269" observedRunningTime="2025-12-01 09:30:59.815349232 +0000 UTC m=+977.083998000" watchObservedRunningTime="2025-12-01 09:30:59.815881937 +0000 UTC m=+977.084530705" Dec 01 09:31:00 crc kubenswrapper[4763]: I1201 09:31:00.205181 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-nb5kc" Dec 01 09:31:01 crc kubenswrapper[4763]: I1201 09:31:01.172000 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-ctr5d" Dec 01 09:31:01 crc kubenswrapper[4763]: I1201 09:31:01.281892 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:31:01 crc kubenswrapper[4763]: I1201 09:31:01.323085 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:31:01 crc kubenswrapper[4763]: I1201 09:31:01.516626 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dmb2k"] Dec 01 09:31:01 crc kubenswrapper[4763]: I1201 09:31:01.672073 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-9vjrk" Dec 01 09:31:02 crc kubenswrapper[4763]: I1201 09:31:02.320312 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj" Dec 01 09:31:02 crc kubenswrapper[4763]: I1201 09:31:02.819688 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dmb2k" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="registry-server" containerID="cri-o://8bb3123aa991b4fc0ed2520266e9427ad66226e2b52d83218096a15dcd6e2738" gracePeriod=2 Dec 01 09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.217919 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.296804 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7hjvw" Dec 01 09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.772548 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hjvw"] Dec 01 09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.836662 4763 generic.go:334] "Generic (PLEG): container finished" podID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerID="8bb3123aa991b4fc0ed2520266e9427ad66226e2b52d83218096a15dcd6e2738" exitCode=0 Dec 01 09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.837353 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmb2k" event={"ID":"96e0bbed-711d-48ca-bf8c-fa678c91c3de","Type":"ContainerDied","Data":"8bb3123aa991b4fc0ed2520266e9427ad66226e2b52d83218096a15dcd6e2738"} Dec 01 09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.922581 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f454l"] Dec 01 
09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.923200 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f454l" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerName="registry-server" containerID="cri-o://7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d" gracePeriod=2 Dec 01 09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.929195 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:31:03 crc kubenswrapper[4763]: I1201 09:31:03.929263 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.022584 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.146244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8rzk\" (UniqueName: \"kubernetes.io/projected/96e0bbed-711d-48ca-bf8c-fa678c91c3de-kube-api-access-j8rzk\") pod \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.146349 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-catalog-content\") pod \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.146438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-utilities\") pod \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\" (UID: \"96e0bbed-711d-48ca-bf8c-fa678c91c3de\") " Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.147291 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-utilities" (OuterVolumeSpecName: "utilities") pod "96e0bbed-711d-48ca-bf8c-fa678c91c3de" (UID: "96e0bbed-711d-48ca-bf8c-fa678c91c3de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.153799 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e0bbed-711d-48ca-bf8c-fa678c91c3de-kube-api-access-j8rzk" (OuterVolumeSpecName: "kube-api-access-j8rzk") pod "96e0bbed-711d-48ca-bf8c-fa678c91c3de" (UID: "96e0bbed-711d-48ca-bf8c-fa678c91c3de"). InnerVolumeSpecName "kube-api-access-j8rzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.195506 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96e0bbed-711d-48ca-bf8c-fa678c91c3de" (UID: "96e0bbed-711d-48ca-bf8c-fa678c91c3de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.247449 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.247494 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8rzk\" (UniqueName: \"kubernetes.io/projected/96e0bbed-711d-48ca-bf8c-fa678c91c3de-kube-api-access-j8rzk\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.247507 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e0bbed-711d-48ca-bf8c-fa678c91c3de-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.510783 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f454l" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.653949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-catalog-content\") pod \"42e4fb80-00e7-430a-8154-e4e581437bfe\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.654086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-utilities\") pod \"42e4fb80-00e7-430a-8154-e4e581437bfe\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.654195 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxtld\" (UniqueName: \"kubernetes.io/projected/42e4fb80-00e7-430a-8154-e4e581437bfe-kube-api-access-sxtld\") pod \"42e4fb80-00e7-430a-8154-e4e581437bfe\" (UID: \"42e4fb80-00e7-430a-8154-e4e581437bfe\") " Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.655368 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-utilities" (OuterVolumeSpecName: "utilities") pod "42e4fb80-00e7-430a-8154-e4e581437bfe" (UID: "42e4fb80-00e7-430a-8154-e4e581437bfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.668296 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e4fb80-00e7-430a-8154-e4e581437bfe-kube-api-access-sxtld" (OuterVolumeSpecName: "kube-api-access-sxtld") pod "42e4fb80-00e7-430a-8154-e4e581437bfe" (UID: "42e4fb80-00e7-430a-8154-e4e581437bfe"). InnerVolumeSpecName "kube-api-access-sxtld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.700366 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42e4fb80-00e7-430a-8154-e4e581437bfe" (UID: "42e4fb80-00e7-430a-8154-e4e581437bfe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.755436 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxtld\" (UniqueName: \"kubernetes.io/projected/42e4fb80-00e7-430a-8154-e4e581437bfe-kube-api-access-sxtld\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.755517 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.755535 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e4fb80-00e7-430a-8154-e4e581437bfe-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.852482 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmb2k" event={"ID":"96e0bbed-711d-48ca-bf8c-fa678c91c3de","Type":"ContainerDied","Data":"cf51dd2f0ac412aca971b4391030b2b09e90586e1b72a7cb33e45e468b88df70"} Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.852527 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmb2k" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.852550 4763 scope.go:117] "RemoveContainer" containerID="8bb3123aa991b4fc0ed2520266e9427ad66226e2b52d83218096a15dcd6e2738" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.861578 4763 generic.go:334] "Generic (PLEG): container finished" podID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerID="7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d" exitCode=0 Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.861640 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f454l" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.861686 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f454l" event={"ID":"42e4fb80-00e7-430a-8154-e4e581437bfe","Type":"ContainerDied","Data":"7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d"} Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.861717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f454l" event={"ID":"42e4fb80-00e7-430a-8154-e4e581437bfe","Type":"ContainerDied","Data":"d7a2b2ff421d8ab63b41595814424dc6bc3b41d1bf393659160ebeb72cad6008"} Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.875665 4763 scope.go:117] "RemoveContainer" containerID="bef500e45be5aad41eb62c6bf57a86c434a3f5ea2d784bad5384e9821afe9292" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.888063 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dmb2k"] Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.902062 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dmb2k"] Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.905085 4763 scope.go:117] "RemoveContainer" containerID="b7543c0d2746dc176b4e6426b0790d34897675b5378c7a12473fd9c86d444347" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.908998 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f454l"] Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.923642 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f454l"] Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.933404 4763 scope.go:117] "RemoveContainer" containerID="7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.952731 4763 scope.go:117] "RemoveContainer" containerID="e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f" Dec 01 09:31:04 crc kubenswrapper[4763]: I1201 09:31:04.988839 4763 scope.go:117] "RemoveContainer" containerID="51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3" Dec 01 09:31:05 crc kubenswrapper[4763]: I1201 09:31:05.003317 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" path="/var/lib/kubelet/pods/42e4fb80-00e7-430a-8154-e4e581437bfe/volumes" Dec 01 09:31:05 crc kubenswrapper[4763]: I1201 09:31:05.004092 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" path="/var/lib/kubelet/pods/96e0bbed-711d-48ca-bf8c-fa678c91c3de/volumes" Dec 01 09:31:05 crc kubenswrapper[4763]: I1201 09:31:05.010065 4763 scope.go:117] "RemoveContainer" containerID="7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d" Dec 01 09:31:05 crc kubenswrapper[4763]: E1201 09:31:05.010421 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d\": container with ID starting with 7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d not found: ID does not exist" containerID="7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d" Dec 01 09:31:05 crc kubenswrapper[4763]: I1201 09:31:05.010448 4763 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d"} err="failed to get container status \"7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d\": rpc error: code = NotFound desc = could not find container \"7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d\": container with ID starting with 7647a6e0e4a369cf9b8beb1af1dc941a7b7bb26d4e07b9e15747246b79502d8d not found: ID does not exist" Dec 01 09:31:05 crc kubenswrapper[4763]: I1201 09:31:05.010481 4763 scope.go:117] "RemoveContainer" containerID="e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f" Dec 01 09:31:05 crc kubenswrapper[4763]: E1201 09:31:05.010847 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f\": container with ID starting with e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f not found: ID does not exist" containerID="e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f" Dec 01 09:31:05 crc kubenswrapper[4763]: I1201 09:31:05.010871 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f"} err="failed to get container status \"e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f\": rpc error: code = NotFound desc = could not find container \"e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f\": container with ID starting with e63525a74fb97b9e084f485e9d5034068b2fe32fa8f36dc430408e3323b8869f not found: ID does not exist" Dec 01 09:31:05 crc kubenswrapper[4763]: I1201 09:31:05.010888 4763 scope.go:117] "RemoveContainer" containerID="51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3" Dec 01 09:31:05 crc kubenswrapper[4763]: E1201 09:31:05.011212 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3\": container with ID starting with 51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3 not found: ID does not exist" containerID="51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3" Dec 01 09:31:05 crc kubenswrapper[4763]: I1201 09:31:05.011276 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3"} err="failed to get container status \"51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3\": rpc error: code = NotFound desc = could not find container \"51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3\": container with ID starting with 51950933d4257a10cdef955d52d1b97226d365ee59319f1f889463ecd25da8e3 not found: ID does not exist" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.608239 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p2mqz"] Dec 01 09:31:18 crc kubenswrapper[4763]: E1201 09:31:18.609113 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerName="extract-content" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609128 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerName="extract-content" Dec 01 09:31:18 crc 
kubenswrapper[4763]: E1201 09:31:18.609151 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerName="extract-utilities" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609159 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerName="extract-utilities" Dec 01 09:31:18 crc kubenswrapper[4763]: E1201 09:31:18.609182 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="extract-utilities" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609190 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="extract-utilities" Dec 01 09:31:18 crc kubenswrapper[4763]: E1201 09:31:18.609212 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="registry-server" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609220 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="registry-server" Dec 01 09:31:18 crc kubenswrapper[4763]: E1201 09:31:18.609240 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerName="registry-server" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609249 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerName="registry-server" Dec 01 09:31:18 crc kubenswrapper[4763]: E1201 09:31:18.609271 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aa0ec16-0f6b-4b6b-894c-31c949e95498" containerName="collect-profiles" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609278 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa0ec16-0f6b-4b6b-894c-31c949e95498" containerName="collect-profiles" Dec 01 09:31:18 crc kubenswrapper[4763]: E1201 09:31:18.609295 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="extract-content" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609303 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="extract-content" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609436 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e4fb80-00e7-430a-8154-e4e581437bfe" containerName="registry-server" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609470 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e0bbed-711d-48ca-bf8c-fa678c91c3de" containerName="registry-server" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.609481 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aa0ec16-0f6b-4b6b-894c-31c949e95498" containerName="collect-profiles" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.610149 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.612036 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.612084 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.613850 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tpc5x" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.623424 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.633914 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p2mqz"] Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.661889 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-config\") pod \"dnsmasq-dns-675f4bcbfc-p2mqz\" (UID: \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.661961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6z5r\" (UniqueName: \"kubernetes.io/projected/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-kube-api-access-j6z5r\") pod \"dnsmasq-dns-675f4bcbfc-p2mqz\" (UID: \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.682047 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lk6j8"] Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.683128 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.684813 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 01 09:31:18 crc kubenswrapper[4763]: I1201 09:31:18.703122 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lk6j8"] Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.763246 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lk6j8\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.763295 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-config\") pod \"dnsmasq-dns-675f4bcbfc-p2mqz\" (UID: \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.763341 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-config\") pod \"dnsmasq-dns-78dd6ddcc-lk6j8\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.763363 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6z5r\" (UniqueName: \"kubernetes.io/projected/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-kube-api-access-j6z5r\") pod \"dnsmasq-dns-675f4bcbfc-p2mqz\" (UID: \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.763413 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fxlk\" (UniqueName: \"kubernetes.io/projected/77faddf8-abb3-4619-bffe-6b35b76e4c2c-kube-api-access-9fxlk\") pod \"dnsmasq-dns-78dd6ddcc-lk6j8\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.764538 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-config\") pod \"dnsmasq-dns-675f4bcbfc-p2mqz\" (UID: \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.785328 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6z5r\" (UniqueName: \"kubernetes.io/projected/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-kube-api-access-j6z5r\") pod \"dnsmasq-dns-675f4bcbfc-p2mqz\" (UID: \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.864562 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-config\") pod \"dnsmasq-dns-78dd6ddcc-lk6j8\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 
Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.864676 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lk6j8\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8"
Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.865445 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lk6j8\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8"
Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.865482 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-config\") pod \"dnsmasq-dns-78dd6ddcc-lk6j8\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8"
Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.881304 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fxlk\" (UniqueName: \"kubernetes.io/projected/77faddf8-abb3-4619-bffe-6b35b76e4c2c-kube-api-access-9fxlk\") pod \"dnsmasq-dns-78dd6ddcc-lk6j8\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8"
Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.926867 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz"
Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:18.995923 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8"
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:19.415390 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p2mqz"] Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:19.520481 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lk6j8"] Dec 01 09:31:19 crc kubenswrapper[4763]: W1201 09:31:19.522337 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77faddf8_abb3_4619_bffe_6b35b76e4c2c.slice/crio-09437d938b3669f61586bc626bcbae8bd0d354dd6d0d0d0b3670e73ad66343c9 WatchSource:0}: Error finding container 09437d938b3669f61586bc626bcbae8bd0d354dd6d0d0d0b3670e73ad66343c9: Status 404 returned error can't find the container with id 09437d938b3669f61586bc626bcbae8bd0d354dd6d0d0d0b3670e73ad66343c9 Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:19.977905 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" event={"ID":"bc0325bc-7b6a-4593-8d7c-8fd100c2239f","Type":"ContainerStarted","Data":"0ce430195e47ca974ecb6e191e89d541f6e9f0e474b624ccb2e37637b3aa803d"} Dec 01 09:31:19 crc kubenswrapper[4763]: I1201 09:31:19.979876 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" event={"ID":"77faddf8-abb3-4619-bffe-6b35b76e4c2c","Type":"ContainerStarted","Data":"09437d938b3669f61586bc626bcbae8bd0d354dd6d0d0d0b3670e73ad66343c9"} Dec 01 09:31:21 crc kubenswrapper[4763]: I1201 09:31:21.739759 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p2mqz"] Dec 01 09:31:21 crc kubenswrapper[4763]: I1201 09:31:21.768574 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6tlks"] Dec 01 09:31:21 crc kubenswrapper[4763]: I1201 09:31:21.770064 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:21 crc kubenswrapper[4763]: I1201 09:31:21.781009 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6tlks"] Dec 01 09:31:21 crc kubenswrapper[4763]: I1201 09:31:21.911557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-config\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:21 crc kubenswrapper[4763]: I1201 09:31:21.911644 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:21 crc kubenswrapper[4763]: I1201 09:31:21.911676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr6t4\" (UniqueName: \"kubernetes.io/projected/26ceb679-309a-4cf5-8fe6-3203868faf07-kube-api-access-tr6t4\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.017709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-config\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.017807 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.017834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr6t4\" (UniqueName: \"kubernetes.io/projected/26ceb679-309a-4cf5-8fe6-3203868faf07-kube-api-access-tr6t4\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.019587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-config\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.019782 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.073938 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr6t4\" (UniqueName: 
\"kubernetes.io/projected/26ceb679-309a-4cf5-8fe6-3203868faf07-kube-api-access-tr6t4\") pod \"dnsmasq-dns-666b6646f7-6tlks\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.095716 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.105863 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lk6j8"] Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.143731 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ccnfp"] Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.145138 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.180230 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ccnfp"] Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.223291 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-config\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.223677 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69cvr\" (UniqueName: \"kubernetes.io/projected/0be30398-4337-4667-a5e3-439ebfe1c01c-kube-api-access-69cvr\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.223767 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.325392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.325515 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-config\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.325540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69cvr\" (UniqueName: \"kubernetes.io/projected/0be30398-4337-4667-a5e3-439ebfe1c01c-kube-api-access-69cvr\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.327107 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.327592 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-config\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.392774 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69cvr\" (UniqueName: \"kubernetes.io/projected/0be30398-4337-4667-a5e3-439ebfe1c01c-kube-api-access-69cvr\") pod \"dnsmasq-dns-57d769cc4f-ccnfp\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.542494 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.816499 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6tlks"] Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.920334 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.922895 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.932721 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.932899 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.932967 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.933126 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.936726 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.936958 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ncncw" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.937153 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 09:31:22 crc kubenswrapper[4763]: I1201 09:31:22.962972 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037550 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037657 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56f133f4-8bf0-4c02-add2-37f41b8904cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037688 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56f133f4-8bf0-4c02-add2-37f41b8904cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037757 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037771 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037803 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rpg\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-kube-api-access-c4rpg\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037821 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037844 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.037858 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.087437 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6tlks" event={"ID":"26ceb679-309a-4cf5-8fe6-3203868faf07","Type":"ContainerStarted","Data":"21dee80c2124bb3262d424f4a0ed134e21c7a54af1404dfef7b7e03249a88a54"} Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.139733 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56f133f4-8bf0-4c02-add2-37f41b8904cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.139819 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.139851 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.139907 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56f133f4-8bf0-4c02-add2-37f41b8904cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.139938 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.139985 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.140028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rpg\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-kube-api-access-c4rpg\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.140052 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.140107 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.140129 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.140180 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.141211 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.141405 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.142301 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.144028 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.144291 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-config-data\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.144612 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.148053 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56f133f4-8bf0-4c02-add2-37f41b8904cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0" Dec 01 09:31:23 crc 
Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.151513 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56f133f4-8bf0-4c02-add2-37f41b8904cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0"
Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.158241 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0"
Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.160159 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4rpg\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-kube-api-access-c4rpg\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0"
Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.177903 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " pod="openstack/rabbitmq-server-0"
Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.252392 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.277520 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ccnfp"]
Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.345972 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.347698 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.349953 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5tks6" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.350149 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.350279 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.350429 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.350638 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.350741 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.350843 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.354133 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449000 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449079 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449133 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53cf9c04-a52d-4827-a700-98ca02183344-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449160 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449205 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449281 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxwvk\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-kube-api-access-lxwvk\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449376 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449482 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.449500 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53cf9c04-a52d-4827-a700-98ca02183344-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.551216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.552386 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.552653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53cf9c04-a52d-4827-a700-98ca02183344-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.552907 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.552946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.553114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53cf9c04-a52d-4827-a700-98ca02183344-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.553152 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.553310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.553460 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.553519 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxwvk\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-kube-api-access-lxwvk\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.553549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.552022 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.559521 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") 
device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.560254 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.560641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.560818 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.562231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.562436 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.564655 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53cf9c04-a52d-4827-a700-98ca02183344-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.569918 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.584055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxwvk\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-kube-api-access-lxwvk\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.601876 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53cf9c04-a52d-4827-a700-98ca02183344-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.609771 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.704945 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:31:23 crc kubenswrapper[4763]: I1201 09:31:23.953113 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:31:23 crc kubenswrapper[4763]: W1201 09:31:23.999825 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56f133f4_8bf0_4c02_add2_37f41b8904cc.slice/crio-43b26529fb8204e56a85168bbb47bbbf967a593161cdbda0b7f12effb921e118 WatchSource:0}: Error finding container 43b26529fb8204e56a85168bbb47bbbf967a593161cdbda0b7f12effb921e118: Status 404 returned error can't find the container with id 43b26529fb8204e56a85168bbb47bbbf967a593161cdbda0b7f12effb921e118 Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.102965 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56f133f4-8bf0-4c02-add2-37f41b8904cc","Type":"ContainerStarted","Data":"43b26529fb8204e56a85168bbb47bbbf967a593161cdbda0b7f12effb921e118"} Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.104151 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" event={"ID":"0be30398-4337-4667-a5e3-439ebfe1c01c","Type":"ContainerStarted","Data":"b442466e817dc3374361347f5c75db81d6c32e015bb7a08cbda3e7b9c8b91b39"} Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.279118 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:31:24 crc kubenswrapper[4763]: W1201 09:31:24.319690 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53cf9c04_a52d_4827_a700_98ca02183344.slice/crio-f0e119b7582423098a7ffffb93f21127f86fdb3558bafdd852696d092acedfa6 WatchSource:0}: Error finding container f0e119b7582423098a7ffffb93f21127f86fdb3558bafdd852696d092acedfa6: Status 404 returned error can't find the container with id f0e119b7582423098a7ffffb93f21127f86fdb3558bafdd852696d092acedfa6 Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.688441 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.690630 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.693641 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.694151 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-sx2f7" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.695539 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.697748 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.699421 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.710726 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.787697 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-kolla-config\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.787752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-config-data-default\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.787790 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.787837 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fc05c35a-b504-4104-a515-737272f6b4d9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.787883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc05c35a-b504-4104-a515-737272f6b4d9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.787923 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc05c35a-b504-4104-a515-737272f6b4d9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.787941 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.787956 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxz7s\" (UniqueName: \"kubernetes.io/projected/fc05c35a-b504-4104-a515-737272f6b4d9-kube-api-access-wxz7s\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.889533 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.889621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fc05c35a-b504-4104-a515-737272f6b4d9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.889699 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc05c35a-b504-4104-a515-737272f6b4d9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.889762 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc05c35a-b504-4104-a515-737272f6b4d9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.889791 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.889813 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxz7s\" (UniqueName: \"kubernetes.io/projected/fc05c35a-b504-4104-a515-737272f6b4d9-kube-api-access-wxz7s\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.889847 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-kolla-config\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.889871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.891299 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-config-data-default\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.891584 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.892435 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.893708 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc05c35a-b504-4104-a515-737272f6b4d9-kolla-config\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.894136 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fc05c35a-b504-4104-a515-737272f6b4d9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.909684 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc05c35a-b504-4104-a515-737272f6b4d9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.912690 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxz7s\" (UniqueName: \"kubernetes.io/projected/fc05c35a-b504-4104-a515-737272f6b4d9-kube-api-access-wxz7s\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.924913 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:24 crc kubenswrapper[4763]: I1201 09:31:24.930106 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc05c35a-b504-4104-a515-737272f6b4d9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fc05c35a-b504-4104-a515-737272f6b4d9\") " pod="openstack/openstack-galera-0" Dec 01 09:31:25 crc kubenswrapper[4763]: I1201 09:31:25.025442 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 01 09:31:25 crc kubenswrapper[4763]: I1201 09:31:25.142364 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53cf9c04-a52d-4827-a700-98ca02183344","Type":"ContainerStarted","Data":"f0e119b7582423098a7ffffb93f21127f86fdb3558bafdd852696d092acedfa6"} Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.219868 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.224759 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.228889 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.229097 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.229223 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rxb72" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.229784 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.233761 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.330546 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.330875 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gndz9\" (UniqueName: \"kubernetes.io/projected/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-kube-api-access-gndz9\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.331008 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.331188 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.331320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " 
pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.331450 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.331597 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.331710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.428406 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.429612 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.433121 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.433182 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.433222 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.433252 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gndz9\" (UniqueName: \"kubernetes.io/projected/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-kube-api-access-gndz9\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.433284 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: 
I1201 09:31:26.433337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.433364 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.433395 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.433888 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.434942 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.434950 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.435050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.435380 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.435434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.436509 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.436769 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-b228r" Dec 01 09:31:26 crc 
kubenswrapper[4763]: I1201 09:31:26.452656 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.467007 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.484722 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.491703 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.507324 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gndz9\" (UniqueName: \"kubernetes.io/projected/7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e-kube-api-access-gndz9\") pod \"openstack-cell1-galera-0\" (UID: \"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e\") " pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.542120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f17760ee-44e7-4bf5-b9d6-368f9b780426-config-data\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.542423 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17760ee-44e7-4bf5-b9d6-368f9b780426-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.542733 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f17760ee-44e7-4bf5-b9d6-368f9b780426-kolla-config\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.542865 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f17760ee-44e7-4bf5-b9d6-368f9b780426-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.543099 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22tn8\" (UniqueName: \"kubernetes.io/projected/f17760ee-44e7-4bf5-b9d6-368f9b780426-kube-api-access-22tn8\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 
09:31:26.561652 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.566018 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.644969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f17760ee-44e7-4bf5-b9d6-368f9b780426-config-data\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.645018 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17760ee-44e7-4bf5-b9d6-368f9b780426-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.645042 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f17760ee-44e7-4bf5-b9d6-368f9b780426-kolla-config\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.645068 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f17760ee-44e7-4bf5-b9d6-368f9b780426-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.645147 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22tn8\" (UniqueName: \"kubernetes.io/projected/f17760ee-44e7-4bf5-b9d6-368f9b780426-kube-api-access-22tn8\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.646411 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f17760ee-44e7-4bf5-b9d6-368f9b780426-config-data\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.646480 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f17760ee-44e7-4bf5-b9d6-368f9b780426-kolla-config\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: W1201 09:31:26.652521 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc05c35a_b504_4104_a515_737272f6b4d9.slice/crio-e8fbe65cc11d32978140c60ceb3dccddebfa81ae55652bf142f0f9730a27f4d6 WatchSource:0}: Error finding container e8fbe65cc11d32978140c60ceb3dccddebfa81ae55652bf142f0f9730a27f4d6: Status 404 returned error can't find the container with id e8fbe65cc11d32978140c60ceb3dccddebfa81ae55652bf142f0f9730a27f4d6 Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.659220 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f17760ee-44e7-4bf5-b9d6-368f9b780426-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.659319 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f17760ee-44e7-4bf5-b9d6-368f9b780426-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.666352 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22tn8\" (UniqueName: \"kubernetes.io/projected/f17760ee-44e7-4bf5-b9d6-368f9b780426-kube-api-access-22tn8\") pod \"memcached-0\" (UID: \"f17760ee-44e7-4bf5-b9d6-368f9b780426\") " pod="openstack/memcached-0" Dec 01 09:31:26 crc kubenswrapper[4763]: I1201 09:31:26.860186 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 01 09:31:27 crc kubenswrapper[4763]: W1201 09:31:27.040652 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ad5d50f_9d2b_49ea_b2e2_3a03fbb3d17e.slice/crio-298bcdbd17a2d55d07491eae5d8282511611a75fb8217c7366c0e23699082f3c WatchSource:0}: Error finding container 298bcdbd17a2d55d07491eae5d8282511611a75fb8217c7366c0e23699082f3c: Status 404 returned error can't find the container with id 298bcdbd17a2d55d07491eae5d8282511611a75fb8217c7366c0e23699082f3c Dec 01 09:31:27 crc kubenswrapper[4763]: I1201 09:31:27.069094 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 01 09:31:27 crc kubenswrapper[4763]: I1201 09:31:27.169816 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e","Type":"ContainerStarted","Data":"298bcdbd17a2d55d07491eae5d8282511611a75fb8217c7366c0e23699082f3c"} Dec 01 09:31:27 crc kubenswrapper[4763]: I1201 09:31:27.171975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fc05c35a-b504-4104-a515-737272f6b4d9","Type":"ContainerStarted","Data":"e8fbe65cc11d32978140c60ceb3dccddebfa81ae55652bf142f0f9730a27f4d6"} Dec 01 09:31:27 crc kubenswrapper[4763]: I1201 09:31:27.516559 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 01 09:31:28 crc kubenswrapper[4763]: I1201 09:31:28.640354 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:28 crc kubenswrapper[4763]: I1201 09:31:28.642437 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:31:28 crc kubenswrapper[4763]: I1201 09:31:28.655966 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mxfv8" Dec 01 09:31:28 crc kubenswrapper[4763]: I1201 09:31:28.662666 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:28 crc kubenswrapper[4763]: I1201 09:31:28.793237 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfsvs\" (UniqueName: \"kubernetes.io/projected/f606a872-f73e-4245-bf9f-75b4a90dc12f-kube-api-access-mfsvs\") pod \"kube-state-metrics-0\" (UID: \"f606a872-f73e-4245-bf9f-75b4a90dc12f\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:28 crc kubenswrapper[4763]: I1201 09:31:28.895396 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfsvs\" (UniqueName: \"kubernetes.io/projected/f606a872-f73e-4245-bf9f-75b4a90dc12f-kube-api-access-mfsvs\") pod \"kube-state-metrics-0\" (UID: \"f606a872-f73e-4245-bf9f-75b4a90dc12f\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:28 crc kubenswrapper[4763]: I1201 09:31:28.940371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfsvs\" (UniqueName: \"kubernetes.io/projected/f606a872-f73e-4245-bf9f-75b4a90dc12f-kube-api-access-mfsvs\") pod \"kube-state-metrics-0\" (UID: \"f606a872-f73e-4245-bf9f-75b4a90dc12f\") " pod="openstack/kube-state-metrics-0" Dec 01 09:31:28 crc kubenswrapper[4763]: I1201 09:31:28.970071 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:31:31 crc kubenswrapper[4763]: I1201 09:31:31.968081 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 09:31:31 crc kubenswrapper[4763]: I1201 09:31:31.972906 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:31.999851 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.000136 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jkwzt" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.000309 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.000462 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.001133 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.003450 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.046692 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0fe31b02-1b96-439f-bc58-9d2d2700d35b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.046988 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.047072 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.047306 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8z5l\" (UniqueName: \"kubernetes.io/projected/0fe31b02-1b96-439f-bc58-9d2d2700d35b-kube-api-access-x8z5l\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.047398 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fe31b02-1b96-439f-bc58-9d2d2700d35b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.047529 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fe31b02-1b96-439f-bc58-9d2d2700d35b-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.048584 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.048732 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.105154 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-26n5d"] Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.106354 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.109987 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ng4jl" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.110330 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.110528 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.131069 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26n5d"] Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.150888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-scripts\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.151027 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fe31b02-1b96-439f-bc58-9d2d2700d35b-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.154935 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0fe31b02-1b96-439f-bc58-9d2d2700d35b-config\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.155103 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.156237 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.156387 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6f5q\" (UniqueName: \"kubernetes.io/projected/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-kube-api-access-n6f5q\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.156611 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0fe31b02-1b96-439f-bc58-9d2d2700d35b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.156703 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-combined-ca-bundle\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.156794 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.157005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.157853 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8z5l\" (UniqueName: \"kubernetes.io/projected/0fe31b02-1b96-439f-bc58-9d2d2700d35b-kube-api-access-x8z5l\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.158119 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fe31b02-1b96-439f-bc58-9d2d2700d35b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.158232 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-log-ovn\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.161658 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-run-ovn\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.164922 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-ovn-controller-tls-certs\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.165120 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-run\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.161020 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0fe31b02-1b96-439f-bc58-9d2d2700d35b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.161608 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.157612 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.160401 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.157598 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0fe31b02-1b96-439f-bc58-9d2d2700d35b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.172856 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fe31b02-1b96-439f-bc58-9d2d2700d35b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.213095 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8z5l\" (UniqueName: \"kubernetes.io/projected/0fe31b02-1b96-439f-bc58-9d2d2700d35b-kube-api-access-x8z5l\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.230841 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-d2z4q"] Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.269275 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-log-ovn\") pod 
\"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.269325 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-run-ovn\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.269348 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-ovn-controller-tls-certs\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.269378 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-run\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.269425 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-scripts\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.269558 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6f5q\" (UniqueName: \"kubernetes.io/projected/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-kube-api-access-n6f5q\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.269630 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-combined-ca-bundle\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.268076 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.272312 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-run\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.272717 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-run-ovn\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.284243 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-combined-ca-bundle\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.290825 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-scripts\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.290957 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-var-log-ovn\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.297150 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-ovn-controller-tls-certs\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.297923 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6f5q\" (UniqueName: \"kubernetes.io/projected/68a1c130-7d5e-4679-9ec7-dd63b84cc8d5-kube-api-access-n6f5q\") pod \"ovn-controller-26n5d\" (UID: \"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5\") " pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.299900 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f17760ee-44e7-4bf5-b9d6-368f9b780426","Type":"ContainerStarted","Data":"1fc26a0a7f4ca781bcc1a49ba22c2a7380cb3ea2d961738a81ec8a21fe5280b0"} Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.304281 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-d2z4q"] Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.307704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0fe31b02-1b96-439f-bc58-9d2d2700d35b\") " pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.372863 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-lib\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.373012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-log\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.373076 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-run\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.373102 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-etc-ovs\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.373132 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcndf\" (UniqueName: \"kubernetes.io/projected/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-kube-api-access-gcndf\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.373182 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-scripts\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.424145 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-26n5d" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.479912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-etc-ovs\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480321 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcndf\" (UniqueName: \"kubernetes.io/projected/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-kube-api-access-gcndf\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480348 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-scripts\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-lib\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480440 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-log\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-run\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480820 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-log\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480833 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-lib\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480966 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-etc-ovs\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.480621 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-var-run\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.484934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-scripts\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.509831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcndf\" (UniqueName: \"kubernetes.io/projected/ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7-kube-api-access-gcndf\") pod \"ovn-controller-ovs-d2z4q\" (UID: \"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7\") " pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.601083 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 01 09:31:32 crc kubenswrapper[4763]: I1201 09:31:32.650394 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:31:33 crc kubenswrapper[4763]: I1201 09:31:33.929020 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:31:33 crc kubenswrapper[4763]: I1201 09:31:33.929156 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.701371 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.702900 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.706252 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.706421 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.707379 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-wcmfw" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.707961 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.709545 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.840731 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.840852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.840879 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.840928 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aab79d8-d046-4877-9fa0-12d87132a99f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.840966 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aab79d8-d046-4877-9fa0-12d87132a99f-config\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.841101 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x859j\" (UniqueName: \"kubernetes.io/projected/2aab79d8-d046-4877-9fa0-12d87132a99f-kube-api-access-x859j\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.841167 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2aab79d8-d046-4877-9fa0-12d87132a99f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " 
pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.841193 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.942486 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aab79d8-d046-4877-9fa0-12d87132a99f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.942560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aab79d8-d046-4877-9fa0-12d87132a99f-config\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.942580 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x859j\" (UniqueName: \"kubernetes.io/projected/2aab79d8-d046-4877-9fa0-12d87132a99f-kube-api-access-x859j\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.942606 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2aab79d8-d046-4877-9fa0-12d87132a99f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.942623 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.942658 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.942735 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.942768 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.943061 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/2aab79d8-d046-4877-9fa0-12d87132a99f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.943084 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.943432 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aab79d8-d046-4877-9fa0-12d87132a99f-config\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.943736 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aab79d8-d046-4877-9fa0-12d87132a99f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.952858 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.953680 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.957076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aab79d8-d046-4877-9fa0-12d87132a99f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.973913 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x859j\" (UniqueName: \"kubernetes.io/projected/2aab79d8-d046-4877-9fa0-12d87132a99f-kube-api-access-x859j\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:35 crc kubenswrapper[4763]: I1201 09:31:35.976340 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2aab79d8-d046-4877-9fa0-12d87132a99f\") " pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:36 crc kubenswrapper[4763]: I1201 09:31:36.025301 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 01 09:31:45 crc kubenswrapper[4763]: E1201 09:31:45.383348 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 09:31:45 crc kubenswrapper[4763]: E1201 09:31:45.384078 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tr6t4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-6tlks_openstack(26ceb679-309a-4cf5-8fe6-3203868faf07): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:31:45 crc kubenswrapper[4763]: E1201 09:31:45.385262 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-6tlks" podUID="26ceb679-309a-4cf5-8fe6-3203868faf07" Dec 01 09:31:45 crc kubenswrapper[4763]: E1201 09:31:45.404448 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-6tlks" podUID="26ceb679-309a-4cf5-8fe6-3203868faf07" Dec 01 09:31:52 crc kubenswrapper[4763]: E1201 09:31:52.731889 
4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 01 09:31:52 crc kubenswrapper[4763]: E1201 09:31:52.732435 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5f6hbh66ch58bh66h6bh88hf5h8dhc7h58bh599hf9h656hd9h657hc6h6bh687h6ch5d5h576h57dh668h646h569h598h8bhf8hc6h8ch555q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22tn8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(f17760ee-44e7-4bf5-b9d6-368f9b780426): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Dec 01 09:31:52 crc kubenswrapper[4763]: E1201 09:31:52.733810 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="f17760ee-44e7-4bf5-b9d6-368f9b780426" Dec 01 09:31:52 crc kubenswrapper[4763]: E1201 09:31:52.910688 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 09:31:52 crc kubenswrapper[4763]: E1201 09:31:52.911088 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9fxlk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-lk6j8_openstack(77faddf8-abb3-4619-bffe-6b35b76e4c2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:31:52 crc kubenswrapper[4763]: E1201 09:31:52.912990 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" podUID="77faddf8-abb3-4619-bffe-6b35b76e4c2c" Dec 01 09:31:53 crc kubenswrapper[4763]: I1201 09:31:53.333252 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/kube-state-metrics-0"] Dec 01 09:31:53 crc kubenswrapper[4763]: W1201 09:31:53.337542 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf606a872_f73e_4245_bf9f_75b4a90dc12f.slice/crio-c3901279aac015a6e0ba1e0587b0925c4d9be796a08a97eeaad92a010cc476f3 WatchSource:0}: Error finding container c3901279aac015a6e0ba1e0587b0925c4d9be796a08a97eeaad92a010cc476f3: Status 404 returned error can't find the container with id c3901279aac015a6e0ba1e0587b0925c4d9be796a08a97eeaad92a010cc476f3 Dec 01 09:31:53 crc kubenswrapper[4763]: E1201 09:31:53.367201 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 09:31:53 crc kubenswrapper[4763]: E1201 09:31:53.367383 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6z5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-p2mqz_openstack(bc0325bc-7b6a-4593-8d7c-8fd100c2239f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:31:53 crc kubenswrapper[4763]: E1201 09:31:53.369441 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" podUID="bc0325bc-7b6a-4593-8d7c-8fd100c2239f" Dec 01 09:31:53 crc kubenswrapper[4763]: I1201 09:31:53.488604 4763 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f606a872-f73e-4245-bf9f-75b4a90dc12f","Type":"ContainerStarted","Data":"c3901279aac015a6e0ba1e0587b0925c4d9be796a08a97eeaad92a010cc476f3"} Dec 01 09:31:53 crc kubenswrapper[4763]: E1201 09:31:53.491789 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="f17760ee-44e7-4bf5-b9d6-368f9b780426" Dec 01 09:31:53 crc kubenswrapper[4763]: I1201 09:31:53.495020 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26n5d"] Dec 01 09:31:53 crc kubenswrapper[4763]: E1201 09:31:53.656089 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 01 09:31:53 crc kubenswrapper[4763]: E1201 09:31:53.656235 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-69cvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-ccnfp_openstack(0be30398-4337-4667-a5e3-439ebfe1c01c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:31:53 crc kubenswrapper[4763]: E1201 09:31:53.657432 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" podUID="0be30398-4337-4667-a5e3-439ebfe1c01c" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.033025 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-d2z4q"] Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.129296 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.136422 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" Dec 01 09:31:54 crc kubenswrapper[4763]: W1201 09:31:54.189096 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac4b5c0f_58b0_4eab_95bc_27bee39cd7a7.slice/crio-f96516685a91bc9d3fdabc2c9ddceb1170b48f63778a6545bc33e8e774a0b732 WatchSource:0}: Error finding container f96516685a91bc9d3fdabc2c9ddceb1170b48f63778a6545bc33e8e774a0b732: Status 404 returned error can't find the container with id f96516685a91bc9d3fdabc2c9ddceb1170b48f63778a6545bc33e8e774a0b732 Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.211923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6z5r\" (UniqueName: \"kubernetes.io/projected/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-kube-api-access-j6z5r\") pod \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\" (UID: \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\") " Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.211973 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-config\") pod \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\" (UID: \"bc0325bc-7b6a-4593-8d7c-8fd100c2239f\") " Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.212069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-dns-svc\") pod \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.212104 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-config\") pod \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.212147 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fxlk\" (UniqueName: \"kubernetes.io/projected/77faddf8-abb3-4619-bffe-6b35b76e4c2c-kube-api-access-9fxlk\") pod \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\" (UID: \"77faddf8-abb3-4619-bffe-6b35b76e4c2c\") " Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.213055 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77faddf8-abb3-4619-bffe-6b35b76e4c2c" (UID: "77faddf8-abb3-4619-bffe-6b35b76e4c2c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.213807 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-config" (OuterVolumeSpecName: "config") pod "bc0325bc-7b6a-4593-8d7c-8fd100c2239f" (UID: "bc0325bc-7b6a-4593-8d7c-8fd100c2239f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.214238 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-config" (OuterVolumeSpecName: "config") pod "77faddf8-abb3-4619-bffe-6b35b76e4c2c" (UID: "77faddf8-abb3-4619-bffe-6b35b76e4c2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.214665 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.214686 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.214695 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77faddf8-abb3-4619-bffe-6b35b76e4c2c-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.219121 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77faddf8-abb3-4619-bffe-6b35b76e4c2c-kube-api-access-9fxlk" (OuterVolumeSpecName: "kube-api-access-9fxlk") pod "77faddf8-abb3-4619-bffe-6b35b76e4c2c" (UID: "77faddf8-abb3-4619-bffe-6b35b76e4c2c"). InnerVolumeSpecName "kube-api-access-9fxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.219532 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-kube-api-access-j6z5r" (OuterVolumeSpecName: "kube-api-access-j6z5r") pod "bc0325bc-7b6a-4593-8d7c-8fd100c2239f" (UID: "bc0325bc-7b6a-4593-8d7c-8fd100c2239f"). InnerVolumeSpecName "kube-api-access-j6z5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.316167 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fxlk\" (UniqueName: \"kubernetes.io/projected/77faddf8-abb3-4619-bffe-6b35b76e4c2c-kube-api-access-9fxlk\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.316212 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6z5r\" (UniqueName: \"kubernetes.io/projected/bc0325bc-7b6a-4593-8d7c-8fd100c2239f-kube-api-access-j6z5r\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.496683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" event={"ID":"77faddf8-abb3-4619-bffe-6b35b76e4c2c","Type":"ContainerDied","Data":"09437d938b3669f61586bc626bcbae8bd0d354dd6d0d0d0b3670e73ad66343c9"} Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.496802 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lk6j8" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.503854 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e","Type":"ContainerStarted","Data":"70912e794f5370e29de19917e79522f712c09efec2975d39c42dbd9548ef2bff"} Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.507368 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fc05c35a-b504-4104-a515-737272f6b4d9","Type":"ContainerStarted","Data":"972e19a2c418270a2bb231a9364ff93cc227bb98337adf2625e3b3d0b03d4c11"} Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.510419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d2z4q" event={"ID":"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7","Type":"ContainerStarted","Data":"f96516685a91bc9d3fdabc2c9ddceb1170b48f63778a6545bc33e8e774a0b732"} Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.512836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26n5d" event={"ID":"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5","Type":"ContainerStarted","Data":"f7f9fe7baa4fdff3b2553a42114b02893e6e7bbb5565110b28bf9fa1b5bd64c7"} Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.517500 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56f133f4-8bf0-4c02-add2-37f41b8904cc","Type":"ContainerStarted","Data":"86983aba13148484a71a0d2d268e9d207c4ea276886647d390e4527e620f1a60"} Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.549677 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.550618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p2mqz" event={"ID":"bc0325bc-7b6a-4593-8d7c-8fd100c2239f","Type":"ContainerDied","Data":"0ce430195e47ca974ecb6e191e89d541f6e9f0e474b624ccb2e37637b3aa803d"} Dec 01 09:31:54 crc kubenswrapper[4763]: E1201 09:31:54.554166 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" podUID="0be30398-4337-4667-a5e3-439ebfe1c01c" Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.686889 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.753446 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lk6j8"] Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.760896 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lk6j8"] Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.808668 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p2mqz"] Dec 01 09:31:54 crc kubenswrapper[4763]: I1201 09:31:54.816834 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p2mqz"] Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.004060 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77faddf8-abb3-4619-bffe-6b35b76e4c2c" path="/var/lib/kubelet/pods/77faddf8-abb3-4619-bffe-6b35b76e4c2c/volumes" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.004498 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0325bc-7b6a-4593-8d7c-8fd100c2239f" path="/var/lib/kubelet/pods/bc0325bc-7b6a-4593-8d7c-8fd100c2239f/volumes" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.142042 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 01 09:31:55 crc kubenswrapper[4763]: W1201 09:31:55.227618 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fe31b02_1b96_439f_bc58_9d2d2700d35b.slice/crio-61f6fd12d36d849b99c3a450b73f27f40e73277864b670bc417551e98c688037 WatchSource:0}: Error finding container 61f6fd12d36d849b99c3a450b73f27f40e73277864b670bc417551e98c688037: Status 404 returned error can't find the container with id 61f6fd12d36d849b99c3a450b73f27f40e73277864b670bc417551e98c688037 Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.484662 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4p79n"] Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.485962 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.514629 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.539157 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4p79n"] Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.653287 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/065901e9-140d-472b-8ed7-6e800f992c73-ovn-rundir\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.653341 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065901e9-140d-472b-8ed7-6e800f992c73-combined-ca-bundle\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.653388 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5h87\" (UniqueName: \"kubernetes.io/projected/065901e9-140d-472b-8ed7-6e800f992c73-kube-api-access-c5h87\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.653430 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/065901e9-140d-472b-8ed7-6e800f992c73-ovs-rundir\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.653484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/065901e9-140d-472b-8ed7-6e800f992c73-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.653563 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065901e9-140d-472b-8ed7-6e800f992c73-config\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.670270 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0fe31b02-1b96-439f-bc58-9d2d2700d35b","Type":"ContainerStarted","Data":"61f6fd12d36d849b99c3a450b73f27f40e73277864b670bc417551e98c688037"} Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.681111 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53cf9c04-a52d-4827-a700-98ca02183344","Type":"ContainerStarted","Data":"8c1d98881c3bc1622990c364f450de88e45d211ceb7dc05c3517a65a63a82b89"} Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.687277 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2aab79d8-d046-4877-9fa0-12d87132a99f","Type":"ContainerStarted","Data":"92fb3bdffa25fcef16cdececdf127bc77150c956c9a8d4d49a406e8274f68b0b"} Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.689125 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6tlks"] Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.756584 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/065901e9-140d-472b-8ed7-6e800f992c73-ovn-rundir\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.756625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065901e9-140d-472b-8ed7-6e800f992c73-combined-ca-bundle\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.756689 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5h87\" (UniqueName: \"kubernetes.io/projected/065901e9-140d-472b-8ed7-6e800f992c73-kube-api-access-c5h87\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.756754 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/065901e9-140d-472b-8ed7-6e800f992c73-ovs-rundir\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.756827 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/065901e9-140d-472b-8ed7-6e800f992c73-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.756947 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065901e9-140d-472b-8ed7-6e800f992c73-config\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.757847 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/065901e9-140d-472b-8ed7-6e800f992c73-ovn-rundir\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.758202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065901e9-140d-472b-8ed7-6e800f992c73-config\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.758999 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/065901e9-140d-472b-8ed7-6e800f992c73-ovs-rundir\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.761929 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sqh6j"] Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.763247 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.766236 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.773778 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sqh6j"] Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.778707 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/065901e9-140d-472b-8ed7-6e800f992c73-combined-ca-bundle\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.779073 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/065901e9-140d-472b-8ed7-6e800f992c73-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.820051 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5h87\" (UniqueName: \"kubernetes.io/projected/065901e9-140d-472b-8ed7-6e800f992c73-kube-api-access-c5h87\") pod \"ovn-controller-metrics-4p79n\" (UID: \"065901e9-140d-472b-8ed7-6e800f992c73\") " pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.856367 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4p79n" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.961396 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.961640 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.961771 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-config\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:55 crc kubenswrapper[4763]: I1201 09:31:55.961906 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh8mp\" (UniqueName: \"kubernetes.io/projected/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-kube-api-access-dh8mp\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.018417 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ccnfp"] Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.065222 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh8mp\" (UniqueName: \"kubernetes.io/projected/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-kube-api-access-dh8mp\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.065308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.065396 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.065447 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-config\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.066448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-config\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.066541 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-9zjzk"] Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.069281 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.070076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.073151 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.077104 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.083099 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-9zjzk"] Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.130331 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh8mp\" (UniqueName: \"kubernetes.io/projected/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-kube-api-access-dh8mp\") pod \"dnsmasq-dns-5bf47b49b7-sqh6j\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.174551 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.174622 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcjzx\" (UniqueName: \"kubernetes.io/projected/cf4df61c-41b4-42c6-bdc7-dca59452a919-kube-api-access-gcjzx\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.174725 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-config\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.174755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-9zjzk\" 
(UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.174854 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-dns-svc\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.188835 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.291299 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-config\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.291352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.291428 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-dns-svc\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.291489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.291519 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcjzx\" (UniqueName: \"kubernetes.io/projected/cf4df61c-41b4-42c6-bdc7-dca59452a919-kube-api-access-gcjzx\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.292932 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-config\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.293417 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-dns-svc\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.293558 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.296000 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.354805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcjzx\" (UniqueName: \"kubernetes.io/projected/cf4df61c-41b4-42c6-bdc7-dca59452a919-kube-api-access-gcjzx\") pod \"dnsmasq-dns-8554648995-9zjzk\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.438421 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.562318 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.712540 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-dns-svc\") pod \"26ceb679-309a-4cf5-8fe6-3203868faf07\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.712620 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr6t4\" (UniqueName: \"kubernetes.io/projected/26ceb679-309a-4cf5-8fe6-3203868faf07-kube-api-access-tr6t4\") pod \"26ceb679-309a-4cf5-8fe6-3203868faf07\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.712699 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-config\") pod \"26ceb679-309a-4cf5-8fe6-3203868faf07\" (UID: \"26ceb679-309a-4cf5-8fe6-3203868faf07\") " Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.713513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-config" (OuterVolumeSpecName: "config") pod "26ceb679-309a-4cf5-8fe6-3203868faf07" (UID: "26ceb679-309a-4cf5-8fe6-3203868faf07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.713746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26ceb679-309a-4cf5-8fe6-3203868faf07" (UID: "26ceb679-309a-4cf5-8fe6-3203868faf07"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.720404 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ceb679-309a-4cf5-8fe6-3203868faf07-kube-api-access-tr6t4" (OuterVolumeSpecName: "kube-api-access-tr6t4") pod "26ceb679-309a-4cf5-8fe6-3203868faf07" (UID: "26ceb679-309a-4cf5-8fe6-3203868faf07"). InnerVolumeSpecName "kube-api-access-tr6t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.720641 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6tlks" event={"ID":"26ceb679-309a-4cf5-8fe6-3203868faf07","Type":"ContainerDied","Data":"21dee80c2124bb3262d424f4a0ed134e21c7a54af1404dfef7b7e03249a88a54"} Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.720726 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6tlks" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.727237 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.815063 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-dns-svc\") pod \"0be30398-4337-4667-a5e3-439ebfe1c01c\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.815791 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-config\") pod \"0be30398-4337-4667-a5e3-439ebfe1c01c\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.815820 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69cvr\" (UniqueName: \"kubernetes.io/projected/0be30398-4337-4667-a5e3-439ebfe1c01c-kube-api-access-69cvr\") pod \"0be30398-4337-4667-a5e3-439ebfe1c01c\" (UID: \"0be30398-4337-4667-a5e3-439ebfe1c01c\") " Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.816170 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.816183 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr6t4\" (UniqueName: \"kubernetes.io/projected/26ceb679-309a-4cf5-8fe6-3203868faf07-kube-api-access-tr6t4\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.816193 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26ceb679-309a-4cf5-8fe6-3203868faf07-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.818670 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0be30398-4337-4667-a5e3-439ebfe1c01c" (UID: "0be30398-4337-4667-a5e3-439ebfe1c01c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.819321 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-config" (OuterVolumeSpecName: "config") pod "0be30398-4337-4667-a5e3-439ebfe1c01c" (UID: "0be30398-4337-4667-a5e3-439ebfe1c01c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.833301 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be30398-4337-4667-a5e3-439ebfe1c01c-kube-api-access-69cvr" (OuterVolumeSpecName: "kube-api-access-69cvr") pod "0be30398-4337-4667-a5e3-439ebfe1c01c" (UID: "0be30398-4337-4667-a5e3-439ebfe1c01c"). InnerVolumeSpecName "kube-api-access-69cvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.859292 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6tlks"] Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.872657 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6tlks"] Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.917222 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.917306 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0be30398-4337-4667-a5e3-439ebfe1c01c-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:56 crc kubenswrapper[4763]: I1201 09:31:56.917318 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69cvr\" (UniqueName: \"kubernetes.io/projected/0be30398-4337-4667-a5e3-439ebfe1c01c-kube-api-access-69cvr\") on node \"crc\" DevicePath \"\"" Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.009028 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26ceb679-309a-4cf5-8fe6-3203868faf07" path="/var/lib/kubelet/pods/26ceb679-309a-4cf5-8fe6-3203868faf07/volumes" Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.252235 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sqh6j"] Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.435886 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-9zjzk"] Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.448988 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4p79n"] Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.738830 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" event={"ID":"0be30398-4337-4667-a5e3-439ebfe1c01c","Type":"ContainerDied","Data":"b442466e817dc3374361347f5c75db81d6c32e015bb7a08cbda3e7b9c8b91b39"} Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.738988 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ccnfp" Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.746648 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" event={"ID":"dd0b3a07-af35-4daf-a0bd-913b1b421cfd","Type":"ContainerStarted","Data":"6472f2dd7943e6adbdf19cb4703f3cb072db15f2204faff2b88e6460011df950"} Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.811411 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ccnfp"] Dec 01 09:31:57 crc kubenswrapper[4763]: I1201 09:31:57.818925 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ccnfp"] Dec 01 09:31:58 crc kubenswrapper[4763]: W1201 09:31:58.296632 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf4df61c_41b4_42c6_bdc7_dca59452a919.slice/crio-bbc1402016a8f8c3d882b7dd595880044080c141dac87273be9eaa1e583e35a4 WatchSource:0}: Error finding container bbc1402016a8f8c3d882b7dd595880044080c141dac87273be9eaa1e583e35a4: Status 404 returned error can't find the container with id bbc1402016a8f8c3d882b7dd595880044080c141dac87273be9eaa1e583e35a4 Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.754624 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f606a872-f73e-4245-bf9f-75b4a90dc12f","Type":"ContainerStarted","Data":"0b063f6419fc4006e9f47667888a26e42b268010b8e140dbe89949348e770766"} Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.754719 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.756745 4763 generic.go:334] "Generic (PLEG): container finished" podID="fc05c35a-b504-4104-a515-737272f6b4d9" containerID="972e19a2c418270a2bb231a9364ff93cc227bb98337adf2625e3b3d0b03d4c11" exitCode=0 Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.756796 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fc05c35a-b504-4104-a515-737272f6b4d9","Type":"ContainerDied","Data":"972e19a2c418270a2bb231a9364ff93cc227bb98337adf2625e3b3d0b03d4c11"} Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.759534 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4p79n" event={"ID":"065901e9-140d-472b-8ed7-6e800f992c73","Type":"ContainerStarted","Data":"d2b6a689eb3856892a9171f48b3092a8cca3c8388bbc07d445863bbe093b16cd"} Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.762955 4763 generic.go:334] "Generic (PLEG): container finished" podID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" containerID="9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3" exitCode=0 Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.762995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" event={"ID":"dd0b3a07-af35-4daf-a0bd-913b1b421cfd","Type":"ContainerDied","Data":"9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3"} Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.767398 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-9zjzk" event={"ID":"cf4df61c-41b4-42c6-bdc7-dca59452a919","Type":"ContainerStarted","Data":"bbc1402016a8f8c3d882b7dd595880044080c141dac87273be9eaa1e583e35a4"} Dec 01 09:31:58 crc kubenswrapper[4763]: I1201 09:31:58.782911 4763 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=25.730457016 podStartE2EDuration="30.782887136s" podCreationTimestamp="2025-12-01 09:31:28 +0000 UTC" firstStartedPulling="2025-12-01 09:31:53.339601727 +0000 UTC m=+1030.608250495" lastFinishedPulling="2025-12-01 09:31:58.392031847 +0000 UTC m=+1035.660680615" observedRunningTime="2025-12-01 09:31:58.773009844 +0000 UTC m=+1036.041658622" watchObservedRunningTime="2025-12-01 09:31:58.782887136 +0000 UTC m=+1036.051535904" Dec 01 09:31:59 crc kubenswrapper[4763]: I1201 09:31:59.004847 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be30398-4337-4667-a5e3-439ebfe1c01c" path="/var/lib/kubelet/pods/0be30398-4337-4667-a5e3-439ebfe1c01c/volumes" Dec 01 09:31:59 crc kubenswrapper[4763]: I1201 09:31:59.782540 4763 generic.go:334] "Generic (PLEG): container finished" podID="7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e" containerID="70912e794f5370e29de19917e79522f712c09efec2975d39c42dbd9548ef2bff" exitCode=0 Dec 01 09:31:59 crc kubenswrapper[4763]: I1201 09:31:59.782613 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e","Type":"ContainerDied","Data":"70912e794f5370e29de19917e79522f712c09efec2975d39c42dbd9548ef2bff"} Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.799951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" event={"ID":"dd0b3a07-af35-4daf-a0bd-913b1b421cfd","Type":"ContainerStarted","Data":"c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d"} Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.801423 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.804475 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerID="f2073c15df3f1e669da2f08060c43a3fe29bb05313c3ec3ab3b4d5e771986cf7" exitCode=0 Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.804523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-9zjzk" event={"ID":"cf4df61c-41b4-42c6-bdc7-dca59452a919","Type":"ContainerDied","Data":"f2073c15df3f1e669da2f08060c43a3fe29bb05313c3ec3ab3b4d5e771986cf7"} Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.811223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e","Type":"ContainerStarted","Data":"a7c2857dde981868e274854dcbbbd0937cc365994ab27748be415366c3b32c00"} Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.823885 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" podStartSLOduration=5.713537134 podStartE2EDuration="6.823870364s" podCreationTimestamp="2025-12-01 09:31:55 +0000 UTC" firstStartedPulling="2025-12-01 09:31:57.279736242 +0000 UTC m=+1034.548385000" lastFinishedPulling="2025-12-01 09:31:58.390069452 +0000 UTC m=+1035.658718230" observedRunningTime="2025-12-01 09:32:01.820551055 +0000 UTC m=+1039.089199823" watchObservedRunningTime="2025-12-01 09:32:01.823870364 +0000 UTC m=+1039.092519132" Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.827156 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"2aab79d8-d046-4877-9fa0-12d87132a99f","Type":"ContainerStarted","Data":"14501409905cb91d0e600fa68888f3a233f8495192f33a340fbe65d3704241c8"} Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.831830 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fc05c35a-b504-4104-a515-737272f6b4d9","Type":"ContainerStarted","Data":"1b9767726395be96bfb619c2e4be8aedcf05159304b99561f526f359187715d6"} Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.839169 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d2z4q" event={"ID":"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7","Type":"ContainerStarted","Data":"9445690ba514db9c4abbbc9fe74f41b1c8525fc5bb712b1f460540ae27b6066d"} Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.849682 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=10.573414502 podStartE2EDuration="36.849660699s" podCreationTimestamp="2025-12-01 09:31:25 +0000 UTC" firstStartedPulling="2025-12-01 09:31:27.051239215 +0000 UTC m=+1004.319887983" lastFinishedPulling="2025-12-01 09:31:53.327485412 +0000 UTC m=+1030.596134180" observedRunningTime="2025-12-01 09:32:01.845675234 +0000 UTC m=+1039.114324002" watchObservedRunningTime="2025-12-01 09:32:01.849660699 +0000 UTC m=+1039.118309457" Dec 01 09:32:01 crc kubenswrapper[4763]: I1201 09:32:01.914049 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=12.171235572 podStartE2EDuration="38.91402961s" podCreationTimestamp="2025-12-01 09:31:23 +0000 UTC" firstStartedPulling="2025-12-01 09:31:26.655816589 +0000 UTC m=+1003.924465357" lastFinishedPulling="2025-12-01 09:31:53.398610627 +0000 UTC m=+1030.667259395" observedRunningTime="2025-12-01 09:32:01.913651697 +0000 UTC m=+1039.182300455" watchObservedRunningTime="2025-12-01 09:32:01.91402961 +0000 UTC m=+1039.182678378" Dec 01 09:32:02 crc kubenswrapper[4763]: I1201 09:32:02.849312 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0fe31b02-1b96-439f-bc58-9d2d2700d35b","Type":"ContainerStarted","Data":"0e66924f590bacbecfe66c18aada31f37c56d1ff47b8e597245e8a27da0ffff4"} Dec 01 09:32:02 crc kubenswrapper[4763]: I1201 09:32:02.851850 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-9zjzk" event={"ID":"cf4df61c-41b4-42c6-bdc7-dca59452a919","Type":"ContainerStarted","Data":"9482be36431dba72697ac9e0096d950a6f488ccc4bd68cff51d62414922ca39e"} Dec 01 09:32:02 crc kubenswrapper[4763]: I1201 09:32:02.851994 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:32:02 crc kubenswrapper[4763]: I1201 09:32:02.855892 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26n5d" event={"ID":"68a1c130-7d5e-4679-9ec7-dd63b84cc8d5","Type":"ContainerStarted","Data":"a09fd1254e65f6a36d753f5a4115d52124181c12bc12be10515c9a3add0da971"} Dec 01 09:32:02 crc kubenswrapper[4763]: I1201 09:32:02.856339 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-26n5d" Dec 01 09:32:02 crc kubenswrapper[4763]: I1201 09:32:02.883367 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-9zjzk" podStartSLOduration=3.790571218 podStartE2EDuration="6.883344936s" 
podCreationTimestamp="2025-12-01 09:31:56 +0000 UTC" firstStartedPulling="2025-12-01 09:31:58.305124015 +0000 UTC m=+1035.573772783" lastFinishedPulling="2025-12-01 09:32:01.397897723 +0000 UTC m=+1038.666546501" observedRunningTime="2025-12-01 09:32:02.878974117 +0000 UTC m=+1040.147622885" watchObservedRunningTime="2025-12-01 09:32:02.883344936 +0000 UTC m=+1040.151993704" Dec 01 09:32:03 crc kubenswrapper[4763]: I1201 09:32:03.033005 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-26n5d" podStartSLOduration=23.336808617 podStartE2EDuration="31.032983209s" podCreationTimestamp="2025-12-01 09:31:32 +0000 UTC" firstStartedPulling="2025-12-01 09:31:53.702166365 +0000 UTC m=+1030.970815133" lastFinishedPulling="2025-12-01 09:32:01.398340957 +0000 UTC m=+1038.666989725" observedRunningTime="2025-12-01 09:32:02.89876327 +0000 UTC m=+1040.167412048" watchObservedRunningTime="2025-12-01 09:32:03.032983209 +0000 UTC m=+1040.301631967" Dec 01 09:32:03 crc kubenswrapper[4763]: I1201 09:32:03.864807 4763 generic.go:334] "Generic (PLEG): container finished" podID="ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7" containerID="9445690ba514db9c4abbbc9fe74f41b1c8525fc5bb712b1f460540ae27b6066d" exitCode=0 Dec 01 09:32:03 crc kubenswrapper[4763]: I1201 09:32:03.866110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d2z4q" event={"ID":"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7","Type":"ContainerDied","Data":"9445690ba514db9c4abbbc9fe74f41b1c8525fc5bb712b1f460540ae27b6066d"} Dec 01 09:32:03 crc kubenswrapper[4763]: I1201 09:32:03.929135 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:32:03 crc kubenswrapper[4763]: I1201 09:32:03.929202 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:32:03 crc kubenswrapper[4763]: I1201 09:32:03.929249 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:32:03 crc kubenswrapper[4763]: I1201 09:32:03.930164 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fbcaa44c81e6e848c09eeb8a68cb5f7f03225b440f52ed6609277022adeaf191"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:32:03 crc kubenswrapper[4763]: I1201 09:32:03.930231 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://fbcaa44c81e6e848c09eeb8a68cb5f7f03225b440f52ed6609277022adeaf191" gracePeriod=600 Dec 01 09:32:04 crc kubenswrapper[4763]: I1201 09:32:04.890324 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" 
containerID="fbcaa44c81e6e848c09eeb8a68cb5f7f03225b440f52ed6609277022adeaf191" exitCode=0 Dec 01 09:32:04 crc kubenswrapper[4763]: I1201 09:32:04.891164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"fbcaa44c81e6e848c09eeb8a68cb5f7f03225b440f52ed6609277022adeaf191"} Dec 01 09:32:04 crc kubenswrapper[4763]: I1201 09:32:04.891231 4763 scope.go:117] "RemoveContainer" containerID="d9e0e5adb882a530747c6596a975101cf0f536a3cb28e48dd137e2024a6a05f6" Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.026448 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.027290 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.901336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2aab79d8-d046-4877-9fa0-12d87132a99f","Type":"ContainerStarted","Data":"5fb77c45ebc6948d0f9166a08e34b06895dfc8594699fdf54fc7eacb31862b84"} Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.907554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"cdb76d67e51814424a96785e6ed38c02e1e5ea6f161d5d45ba5cfcfc9064da51"} Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.914280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f17760ee-44e7-4bf5-b9d6-368f9b780426","Type":"ContainerStarted","Data":"789c896a0d4db1216a45d216ccb2adc6700315fa6e4744db405d39a5b2db59c2"} Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.914556 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.919347 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d2z4q" event={"ID":"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7","Type":"ContainerStarted","Data":"caef403b92086edf59a9e196543ba610bbc13d826cf5644233c62f9c934864a4"} Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.924971 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=21.532310233 podStartE2EDuration="31.924953958s" podCreationTimestamp="2025-12-01 09:31:34 +0000 UTC" firstStartedPulling="2025-12-01 09:31:55.036324211 +0000 UTC m=+1032.304972989" lastFinishedPulling="2025-12-01 09:32:05.428967936 +0000 UTC m=+1042.697616714" observedRunningTime="2025-12-01 09:32:05.921695959 +0000 UTC m=+1043.190344727" watchObservedRunningTime="2025-12-01 09:32:05.924953958 +0000 UTC m=+1043.193602726" Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.933052 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0fe31b02-1b96-439f-bc58-9d2d2700d35b","Type":"ContainerStarted","Data":"5de731786562db80eaaa8e2f3001dcee8f5dc0ee5523e81af8c403bd13e49a5b"} Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.957985 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=6.309320884 podStartE2EDuration="39.957963565s" podCreationTimestamp="2025-12-01 09:31:26 +0000 
UTC" firstStartedPulling="2025-12-01 09:31:31.882603726 +0000 UTC m=+1009.151252494" lastFinishedPulling="2025-12-01 09:32:05.531246407 +0000 UTC m=+1042.799895175" observedRunningTime="2025-12-01 09:32:05.953024482 +0000 UTC m=+1043.221673250" watchObservedRunningTime="2025-12-01 09:32:05.957963565 +0000 UTC m=+1043.226612333" Dec 01 09:32:05 crc kubenswrapper[4763]: I1201 09:32:05.977362 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=25.772883789 podStartE2EDuration="35.977337894s" podCreationTimestamp="2025-12-01 09:31:30 +0000 UTC" firstStartedPulling="2025-12-01 09:31:55.231826104 +0000 UTC m=+1032.500474872" lastFinishedPulling="2025-12-01 09:32:05.436280209 +0000 UTC m=+1042.704928977" observedRunningTime="2025-12-01 09:32:05.977103442 +0000 UTC m=+1043.245752210" watchObservedRunningTime="2025-12-01 09:32:05.977337894 +0000 UTC m=+1043.245986662" Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.007163 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4p79n" podStartSLOduration=3.872386867 podStartE2EDuration="11.007145774s" podCreationTimestamp="2025-12-01 09:31:55 +0000 UTC" firstStartedPulling="2025-12-01 09:31:58.299803128 +0000 UTC m=+1035.568451896" lastFinishedPulling="2025-12-01 09:32:05.434562035 +0000 UTC m=+1042.703210803" observedRunningTime="2025-12-01 09:32:05.997316608 +0000 UTC m=+1043.265965366" watchObservedRunningTime="2025-12-01 09:32:06.007145774 +0000 UTC m=+1043.275794542" Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.025512 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.025839 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.095354 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.191246 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.563533 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.563816 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.942220 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-d2z4q" event={"ID":"ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7","Type":"ContainerStarted","Data":"7a28f50eff4bab7dafbed8a25e21e94679c75b6bceb928c90c1e7c510ff70299"} Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.943873 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4p79n" event={"ID":"065901e9-140d-472b-8ed7-6e800f992c73","Type":"ContainerStarted","Data":"8cc0408606029be245757531bb6bbdefb4865722a10e915c2bc092acdd2f27f9"} Dec 01 09:32:06 crc kubenswrapper[4763]: I1201 09:32:06.965112 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-d2z4q" podStartSLOduration=27.848442755 podStartE2EDuration="34.96508696s" podCreationTimestamp="2025-12-01 09:31:32 +0000 UTC" 
firstStartedPulling="2025-12-01 09:31:54.19198672 +0000 UTC m=+1031.460635488" lastFinishedPulling="2025-12-01 09:32:01.308630925 +0000 UTC m=+1038.577279693" observedRunningTime="2025-12-01 09:32:06.961275648 +0000 UTC m=+1044.229924416" watchObservedRunningTime="2025-12-01 09:32:06.96508696 +0000 UTC m=+1044.233735738" Dec 01 09:32:07 crc kubenswrapper[4763]: I1201 09:32:07.004433 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 01 09:32:07 crc kubenswrapper[4763]: I1201 09:32:07.343589 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 01 09:32:07 crc kubenswrapper[4763]: I1201 09:32:07.446556 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 01 09:32:07 crc kubenswrapper[4763]: I1201 09:32:07.601355 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 01 09:32:07 crc kubenswrapper[4763]: I1201 09:32:07.650710 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:32:07 crc kubenswrapper[4763]: I1201 09:32:07.651161 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:32:08 crc kubenswrapper[4763]: I1201 09:32:08.602077 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 01 09:32:08 crc kubenswrapper[4763]: I1201 09:32:08.640079 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 01 09:32:08 crc kubenswrapper[4763]: I1201 09:32:08.785331 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 01 09:32:08 crc kubenswrapper[4763]: I1201 09:32:08.916989 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 01 09:32:08 crc kubenswrapper[4763]: I1201 09:32:08.980726 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.022355 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.287727 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.289050 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.293903 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.294225 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.296604 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lnm6t" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.296635 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.313667 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.359468 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-config\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.359550 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsk69\" (UniqueName: \"kubernetes.io/projected/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-kube-api-access-dsk69\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.359620 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.359777 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-scripts\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.360015 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.360191 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.360230 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: 
I1201 09:32:09.461489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.461585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.461605 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.461698 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-config\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.461754 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsk69\" (UniqueName: \"kubernetes.io/projected/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-kube-api-access-dsk69\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.461771 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.462797 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-config\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.462862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-scripts\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.462912 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.463534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-scripts\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.471269 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.471573 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.473890 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.486127 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsk69\" (UniqueName: \"kubernetes.io/projected/f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc-kube-api-access-dsk69\") pod \"ovn-northd-0\" (UID: \"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc\") " pod="openstack/ovn-northd-0" Dec 01 09:32:09 crc kubenswrapper[4763]: I1201 09:32:09.609285 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 01 09:32:10 crc kubenswrapper[4763]: I1201 09:32:10.095849 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 01 09:32:10 crc kubenswrapper[4763]: I1201 09:32:10.984523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc","Type":"ContainerStarted","Data":"d042a1788a131eb0ec314e5ec2992b524d3ddb805832d363716bb34e6a8d21b7"} Dec 01 09:32:11 crc kubenswrapper[4763]: I1201 09:32:11.441532 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:32:11 crc kubenswrapper[4763]: I1201 09:32:11.502699 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sqh6j"] Dec 01 09:32:11 crc kubenswrapper[4763]: I1201 09:32:11.502924 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" podUID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" containerName="dnsmasq-dns" containerID="cri-o://c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d" gracePeriod=10 Dec 01 09:32:11 crc kubenswrapper[4763]: I1201 09:32:11.861600 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 01 09:32:11 crc kubenswrapper[4763]: I1201 09:32:11.952135 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.012151 4763 generic.go:334] "Generic (PLEG): container finished" podID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" containerID="c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d" exitCode=0 Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.013490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" event={"ID":"dd0b3a07-af35-4daf-a0bd-913b1b421cfd","Type":"ContainerDied","Data":"c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d"} Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.013537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" event={"ID":"dd0b3a07-af35-4daf-a0bd-913b1b421cfd","Type":"ContainerDied","Data":"6472f2dd7943e6adbdf19cb4703f3cb072db15f2204faff2b88e6460011df950"} Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.013561 4763 scope.go:117] "RemoveContainer" containerID="c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.013709 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sqh6j" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.018706 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-dns-svc\") pod \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.018878 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh8mp\" (UniqueName: \"kubernetes.io/projected/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-kube-api-access-dh8mp\") pod \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.019180 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-ovsdbserver-nb\") pod \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.019304 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-config\") pod \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\" (UID: \"dd0b3a07-af35-4daf-a0bd-913b1b421cfd\") " Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.035891 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc","Type":"ContainerStarted","Data":"98883f86e2d6ade5e7506f07d1a7657d66b27f2fefd637748a3a95fc26a3105e"} Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.035948 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc","Type":"ContainerStarted","Data":"963a2c6328597b7e7024bd917714a3b69f30d069f7960470feca8e3e2d33d4f9"} Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.037199 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.058725 4763 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-kube-api-access-dh8mp" (OuterVolumeSpecName: "kube-api-access-dh8mp") pod "dd0b3a07-af35-4daf-a0bd-913b1b421cfd" (UID: "dd0b3a07-af35-4daf-a0bd-913b1b421cfd"). InnerVolumeSpecName "kube-api-access-dh8mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.059319 4763 scope.go:117] "RemoveContainer" containerID="9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.080405 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-config" (OuterVolumeSpecName: "config") pod "dd0b3a07-af35-4daf-a0bd-913b1b421cfd" (UID: "dd0b3a07-af35-4daf-a0bd-913b1b421cfd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.094055 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.019407847 podStartE2EDuration="3.09403275s" podCreationTimestamp="2025-12-01 09:32:09 +0000 UTC" firstStartedPulling="2025-12-01 09:32:10.09942956 +0000 UTC m=+1047.368078338" lastFinishedPulling="2025-12-01 09:32:11.174054483 +0000 UTC m=+1048.442703241" observedRunningTime="2025-12-01 09:32:12.078982138 +0000 UTC m=+1049.347630906" watchObservedRunningTime="2025-12-01 09:32:12.09403275 +0000 UTC m=+1049.362681518" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.105348 4763 scope.go:117] "RemoveContainer" containerID="c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d" Dec 01 09:32:12 crc kubenswrapper[4763]: E1201 09:32:12.105927 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d\": container with ID starting with c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d not found: ID does not exist" containerID="c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.105963 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d"} err="failed to get container status \"c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d\": rpc error: code = NotFound desc = could not find container \"c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d\": container with ID starting with c08c8eeb3ffccde0a0a879cb86fa6991c52414b4004214b5c492c87e5317984d not found: ID does not exist" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.105990 4763 scope.go:117] "RemoveContainer" containerID="9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3" Dec 01 09:32:12 crc kubenswrapper[4763]: E1201 09:32:12.106551 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3\": container with ID starting with 9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3 not found: ID does not exist" containerID="9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.106593 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3"} err="failed to get container status \"9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3\": rpc error: code = NotFound desc = could not find container \"9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3\": container with ID starting with 9a8f86974559a3c3d98f6c9c8d31690d9822b7d675730898306de14a495ca5d3 not found: ID does not exist" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.106909 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd0b3a07-af35-4daf-a0bd-913b1b421cfd" (UID: "dd0b3a07-af35-4daf-a0bd-913b1b421cfd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.116209 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd0b3a07-af35-4daf-a0bd-913b1b421cfd" (UID: "dd0b3a07-af35-4daf-a0bd-913b1b421cfd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.122151 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.122218 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.122229 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.122238 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh8mp\" (UniqueName: \"kubernetes.io/projected/dd0b3a07-af35-4daf-a0bd-913b1b421cfd-kube-api-access-dh8mp\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.231118 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3641-account-create-update-fzfbs"] Dec 01 09:32:12 crc kubenswrapper[4763]: E1201 09:32:12.231562 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" containerName="init" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.231581 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" containerName="init" Dec 01 09:32:12 crc kubenswrapper[4763]: E1201 09:32:12.231606 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" containerName="dnsmasq-dns" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.231614 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" containerName="dnsmasq-dns" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.231805 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" 
containerName="dnsmasq-dns" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.232438 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3641-account-create-update-fzfbs" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.241356 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.245158 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lfskc"] Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.246411 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lfskc" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.270878 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3641-account-create-update-fzfbs"] Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.328530 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lfskc"] Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.336267 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4zhc\" (UniqueName: \"kubernetes.io/projected/c9aeec68-c0a0-4c88-994c-b229ef8aa223-kube-api-access-x4zhc\") pod \"glance-3641-account-create-update-fzfbs\" (UID: \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\") " pod="openstack/glance-3641-account-create-update-fzfbs" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.336321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmnxd\" (UniqueName: \"kubernetes.io/projected/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-kube-api-access-bmnxd\") pod \"glance-db-create-lfskc\" (UID: \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\") " pod="openstack/glance-db-create-lfskc" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.350171 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-operator-scripts\") pod \"glance-db-create-lfskc\" (UID: \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\") " pod="openstack/glance-db-create-lfskc" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.350658 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9aeec68-c0a0-4c88-994c-b229ef8aa223-operator-scripts\") pod \"glance-3641-account-create-update-fzfbs\" (UID: \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\") " pod="openstack/glance-3641-account-create-update-fzfbs" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.387050 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sqh6j"] Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.393930 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sqh6j"] Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.451923 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-operator-scripts\") pod \"glance-db-create-lfskc\" (UID: \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\") " pod="openstack/glance-db-create-lfskc" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.452033 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9aeec68-c0a0-4c88-994c-b229ef8aa223-operator-scripts\") pod \"glance-3641-account-create-update-fzfbs\" (UID: \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\") " pod="openstack/glance-3641-account-create-update-fzfbs" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.452089 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4zhc\" (UniqueName: \"kubernetes.io/projected/c9aeec68-c0a0-4c88-994c-b229ef8aa223-kube-api-access-x4zhc\") pod \"glance-3641-account-create-update-fzfbs\" (UID: \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\") " pod="openstack/glance-3641-account-create-update-fzfbs" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.452120 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmnxd\" (UniqueName: \"kubernetes.io/projected/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-kube-api-access-bmnxd\") pod \"glance-db-create-lfskc\" (UID: \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\") " pod="openstack/glance-db-create-lfskc" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.452963 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-operator-scripts\") pod \"glance-db-create-lfskc\" (UID: \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\") " pod="openstack/glance-db-create-lfskc" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.452977 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9aeec68-c0a0-4c88-994c-b229ef8aa223-operator-scripts\") pod \"glance-3641-account-create-update-fzfbs\" (UID: \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\") " pod="openstack/glance-3641-account-create-update-fzfbs" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.470040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmnxd\" (UniqueName: \"kubernetes.io/projected/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-kube-api-access-bmnxd\") pod \"glance-db-create-lfskc\" (UID: \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\") " pod="openstack/glance-db-create-lfskc" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.470954 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4zhc\" (UniqueName: \"kubernetes.io/projected/c9aeec68-c0a0-4c88-994c-b229ef8aa223-kube-api-access-x4zhc\") pod \"glance-3641-account-create-update-fzfbs\" (UID: \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\") " pod="openstack/glance-3641-account-create-update-fzfbs" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.613791 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3641-account-create-update-fzfbs" Dec 01 09:32:12 crc kubenswrapper[4763]: I1201 09:32:12.639970 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lfskc" Dec 01 09:32:13 crc kubenswrapper[4763]: I1201 09:32:13.036992 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0b3a07-af35-4daf-a0bd-913b1b421cfd" path="/var/lib/kubelet/pods/dd0b3a07-af35-4daf-a0bd-913b1b421cfd/volumes" Dec 01 09:32:13 crc kubenswrapper[4763]: I1201 09:32:13.319250 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lfskc"] Dec 01 09:32:13 crc kubenswrapper[4763]: I1201 09:32:13.496301 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3641-account-create-update-fzfbs"] Dec 01 09:32:13 crc kubenswrapper[4763]: W1201 09:32:13.505799 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9aeec68_c0a0_4c88_994c_b229ef8aa223.slice/crio-d9d7bfebb61844a5d5c2f187c8fe565935bb2cd51f45be9c84880fbab5b5b57e WatchSource:0}: Error finding container d9d7bfebb61844a5d5c2f187c8fe565935bb2cd51f45be9c84880fbab5b5b57e: Status 404 returned error can't find the container with id d9d7bfebb61844a5d5c2f187c8fe565935bb2cd51f45be9c84880fbab5b5b57e Dec 01 09:32:14 crc kubenswrapper[4763]: I1201 09:32:14.081209 4763 generic.go:334] "Generic (PLEG): container finished" podID="30a341bd-d0a9-4c8f-a8e6-3941faa51e05" containerID="f64a184dac5e764ff6478bd366a1de4d336b7ebb88b379420fd2801046f5a0db" exitCode=0 Dec 01 09:32:14 crc kubenswrapper[4763]: I1201 09:32:14.081301 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lfskc" event={"ID":"30a341bd-d0a9-4c8f-a8e6-3941faa51e05","Type":"ContainerDied","Data":"f64a184dac5e764ff6478bd366a1de4d336b7ebb88b379420fd2801046f5a0db"} Dec 01 09:32:14 crc kubenswrapper[4763]: I1201 09:32:14.081335 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lfskc" event={"ID":"30a341bd-d0a9-4c8f-a8e6-3941faa51e05","Type":"ContainerStarted","Data":"851018f7fb705624a12783a788b0da0b709e13596c2e9f6d4571c82b7d75672f"} Dec 01 09:32:14 crc kubenswrapper[4763]: I1201 09:32:14.084558 4763 generic.go:334] "Generic (PLEG): container finished" podID="c9aeec68-c0a0-4c88-994c-b229ef8aa223" containerID="25c0d181959c35a62c6b6ea3f219633715f9c19cf2ae403d0a07c6050b85bed5" exitCode=0 Dec 01 09:32:14 crc kubenswrapper[4763]: I1201 09:32:14.084653 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3641-account-create-update-fzfbs" event={"ID":"c9aeec68-c0a0-4c88-994c-b229ef8aa223","Type":"ContainerDied","Data":"25c0d181959c35a62c6b6ea3f219633715f9c19cf2ae403d0a07c6050b85bed5"} Dec 01 09:32:14 crc kubenswrapper[4763]: I1201 09:32:14.084678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3641-account-create-update-fzfbs" event={"ID":"c9aeec68-c0a0-4c88-994c-b229ef8aa223","Type":"ContainerStarted","Data":"d9d7bfebb61844a5d5c2f187c8fe565935bb2cd51f45be9c84880fbab5b5b57e"} Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.463222 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lfskc" Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.471277 4763 util.go:48] "No ready sandbox for pod can be found. 
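The block above is the kubelet's short-lived-job pattern, which repeats below for keystone and placement: SyncLoop ADD from the API, the volume reconciler attaches and mounts the pod's volumes, a fresh sandbox is started, and PLEG reports ContainerStarted followed almost immediately by ContainerDied with exitCode=0 once the mariadb-database-create script finishes. A minimal sketch of checking one of these jobs from the API side with the Python kubernetes client (pod and namespace names taken from the log; a reachable cluster and kubeconfig are assumed):

    from kubernetes import client, config

    # Load credentials from the default kubeconfig; inside a pod,
    # config.load_incluster_config() would be used instead.
    config.load_kube_config()
    v1 = client.CoreV1Api()

    pod = v1.read_namespaced_pod("glance-db-create-lfskc", "openstack")
    for cs in pod.status.container_statuses or []:
        t = cs.state.terminated
        if t:  # a completed job container reports its exit code here
            print(cs.name, "exited", t.exit_code, "at", t.finished_at)

The exitCode=0 that the PLEG events report above is the same value that would show up as t.exit_code.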
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.606920 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zhc\" (UniqueName: \"kubernetes.io/projected/c9aeec68-c0a0-4c88-994c-b229ef8aa223-kube-api-access-x4zhc\") pod \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\" (UID: \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\") "
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.607017 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-operator-scripts\") pod \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\" (UID: \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\") "
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.607084 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9aeec68-c0a0-4c88-994c-b229ef8aa223-operator-scripts\") pod \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\" (UID: \"c9aeec68-c0a0-4c88-994c-b229ef8aa223\") "
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.607113 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmnxd\" (UniqueName: \"kubernetes.io/projected/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-kube-api-access-bmnxd\") pod \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\" (UID: \"30a341bd-d0a9-4c8f-a8e6-3941faa51e05\") "
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.607532 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30a341bd-d0a9-4c8f-a8e6-3941faa51e05" (UID: "30a341bd-d0a9-4c8f-a8e6-3941faa51e05"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.607781 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9aeec68-c0a0-4c88-994c-b229ef8aa223-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9aeec68-c0a0-4c88-994c-b229ef8aa223" (UID: "c9aeec68-c0a0-4c88-994c-b229ef8aa223"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.612424 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9aeec68-c0a0-4c88-994c-b229ef8aa223-kube-api-access-x4zhc" (OuterVolumeSpecName: "kube-api-access-x4zhc") pod "c9aeec68-c0a0-4c88-994c-b229ef8aa223" (UID: "c9aeec68-c0a0-4c88-994c-b229ef8aa223"). InnerVolumeSpecName "kube-api-access-x4zhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.612678 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-kube-api-access-bmnxd" (OuterVolumeSpecName: "kube-api-access-bmnxd") pod "30a341bd-d0a9-4c8f-a8e6-3941faa51e05" (UID: "30a341bd-d0a9-4c8f-a8e6-3941faa51e05"). InnerVolumeSpecName "kube-api-access-bmnxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.708613 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.708648 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9aeec68-c0a0-4c88-994c-b229ef8aa223-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.708657 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmnxd\" (UniqueName: \"kubernetes.io/projected/30a341bd-d0a9-4c8f-a8e6-3941faa51e05-kube-api-access-bmnxd\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:15 crc kubenswrapper[4763]: I1201 09:32:15.708668 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zhc\" (UniqueName: \"kubernetes.io/projected/c9aeec68-c0a0-4c88-994c-b229ef8aa223-kube-api-access-x4zhc\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.098798 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lfskc"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.098785 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lfskc" event={"ID":"30a341bd-d0a9-4c8f-a8e6-3941faa51e05","Type":"ContainerDied","Data":"851018f7fb705624a12783a788b0da0b709e13596c2e9f6d4571c82b7d75672f"}
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.098915 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="851018f7fb705624a12783a788b0da0b709e13596c2e9f6d4571c82b7d75672f"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.100804 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3641-account-create-update-fzfbs" event={"ID":"c9aeec68-c0a0-4c88-994c-b229ef8aa223","Type":"ContainerDied","Data":"d9d7bfebb61844a5d5c2f187c8fe565935bb2cd51f45be9c84880fbab5b5b57e"}
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.100831 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9d7bfebb61844a5d5c2f187c8fe565935bb2cd51f45be9c84880fbab5b5b57e"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.100841 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3641-account-create-update-fzfbs"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.375391 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fd4n8"]
Dec 01 09:32:16 crc kubenswrapper[4763]: E1201 09:32:16.375983 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a341bd-d0a9-4c8f-a8e6-3941faa51e05" containerName="mariadb-database-create"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.376059 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a341bd-d0a9-4c8f-a8e6-3941faa51e05" containerName="mariadb-database-create"
Dec 01 09:32:16 crc kubenswrapper[4763]: E1201 09:32:16.376137 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9aeec68-c0a0-4c88-994c-b229ef8aa223" containerName="mariadb-account-create-update"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.376193 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9aeec68-c0a0-4c88-994c-b229ef8aa223" containerName="mariadb-account-create-update"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.376407 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9aeec68-c0a0-4c88-994c-b229ef8aa223" containerName="mariadb-account-create-update"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.376521 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a341bd-d0a9-4c8f-a8e6-3941faa51e05" containerName="mariadb-database-create"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.377125 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fd4n8"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.389113 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fd4n8"]
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.528445 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j282m\" (UniqueName: \"kubernetes.io/projected/69b412ca-2a25-4362-a6f1-67fbc091e68b-kube-api-access-j282m\") pod \"keystone-db-create-fd4n8\" (UID: \"69b412ca-2a25-4362-a6f1-67fbc091e68b\") " pod="openstack/keystone-db-create-fd4n8"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.528542 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69b412ca-2a25-4362-a6f1-67fbc091e68b-operator-scripts\") pod \"keystone-db-create-fd4n8\" (UID: \"69b412ca-2a25-4362-a6f1-67fbc091e68b\") " pod="openstack/keystone-db-create-fd4n8"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.587102 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f80b-account-create-update-bmk58"]
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.588388 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f80b-account-create-update-bmk58"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.590587 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.606153 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f80b-account-create-update-bmk58"]
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.631537 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j282m\" (UniqueName: \"kubernetes.io/projected/69b412ca-2a25-4362-a6f1-67fbc091e68b-kube-api-access-j282m\") pod \"keystone-db-create-fd4n8\" (UID: \"69b412ca-2a25-4362-a6f1-67fbc091e68b\") " pod="openstack/keystone-db-create-fd4n8"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.631614 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69b412ca-2a25-4362-a6f1-67fbc091e68b-operator-scripts\") pod \"keystone-db-create-fd4n8\" (UID: \"69b412ca-2a25-4362-a6f1-67fbc091e68b\") " pod="openstack/keystone-db-create-fd4n8"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.632556 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69b412ca-2a25-4362-a6f1-67fbc091e68b-operator-scripts\") pod \"keystone-db-create-fd4n8\" (UID: \"69b412ca-2a25-4362-a6f1-67fbc091e68b\") " pod="openstack/keystone-db-create-fd4n8"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.651200 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j282m\" (UniqueName: \"kubernetes.io/projected/69b412ca-2a25-4362-a6f1-67fbc091e68b-kube-api-access-j282m\") pod \"keystone-db-create-fd4n8\" (UID: \"69b412ca-2a25-4362-a6f1-67fbc091e68b\") " pod="openstack/keystone-db-create-fd4n8"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.692412 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fd4n8"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.719651 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xqvdp"]
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.720738 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xqvdp"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.728719 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xqvdp"]
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.735918 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxtn7\" (UniqueName: \"kubernetes.io/projected/eb77b336-6911-4e22-888a-0a8e20435893-kube-api-access-bxtn7\") pod \"keystone-f80b-account-create-update-bmk58\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") " pod="openstack/keystone-f80b-account-create-update-bmk58"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.736549 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb77b336-6911-4e22-888a-0a8e20435893-operator-scripts\") pod \"keystone-f80b-account-create-update-bmk58\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") " pod="openstack/keystone-f80b-account-create-update-bmk58"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.809113 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-35ab-account-create-update-7jbxv"]
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.810299 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35ab-account-create-update-7jbxv"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.830600 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-35ab-account-create-update-7jbxv"]
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.839090 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb77b336-6911-4e22-888a-0a8e20435893-operator-scripts\") pod \"keystone-f80b-account-create-update-bmk58\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") " pod="openstack/keystone-f80b-account-create-update-bmk58"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.839134 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca6055d4-f54b-4ed5-b566-cb9be368fb20-operator-scripts\") pod \"placement-db-create-xqvdp\" (UID: \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\") " pod="openstack/placement-db-create-xqvdp"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.839231 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxtn7\" (UniqueName: \"kubernetes.io/projected/eb77b336-6911-4e22-888a-0a8e20435893-kube-api-access-bxtn7\") pod \"keystone-f80b-account-create-update-bmk58\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") " pod="openstack/keystone-f80b-account-create-update-bmk58"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.839259 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkck6\" (UniqueName: \"kubernetes.io/projected/ca6055d4-f54b-4ed5-b566-cb9be368fb20-kube-api-access-mkck6\") pod \"placement-db-create-xqvdp\" (UID: \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\") " pod="openstack/placement-db-create-xqvdp"
Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.840042 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb77b336-6911-4e22-888a-0a8e20435893-operator-scripts\") pod \"keystone-f80b-account-create-update-bmk58\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") " pod="openstack/keystone-f80b-account-create-update-bmk58"
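The RemoveStaleState entries logged at error level just above are housekeeping, not failures: when the keystone pod is admitted, the CPU and memory managers notice that per-container resource-state entries still exist for the glance job containers that have already exited, and drop them. Roughly this, as a toy dictionary cleanup (UIDs shortened, CPU sets invented for illustration):

    # Stale per-container CPU assignments left behind by completed pods.
    assignments = {
        ("30a341bd", "mariadb-database-create"): {2, 3},
        ("c9aeec68", "mariadb-account-create-update"): {4, 5},
    }
    running = set()  # neither container exists any more

    for key in list(assignments):
        if key not in running:
            print("RemoveStaleState: removing container", key)
            del assignments[key]  # the "Deleted CPUSet assignment" entries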
\"kubernetes.io/configmap/eb77b336-6911-4e22-888a-0a8e20435893-operator-scripts\") pod \"keystone-f80b-account-create-update-bmk58\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") " pod="openstack/keystone-f80b-account-create-update-bmk58" Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.849851 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.868505 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxtn7\" (UniqueName: \"kubernetes.io/projected/eb77b336-6911-4e22-888a-0a8e20435893-kube-api-access-bxtn7\") pod \"keystone-f80b-account-create-update-bmk58\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") " pod="openstack/keystone-f80b-account-create-update-bmk58" Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.907189 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f80b-account-create-update-bmk58" Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.946727 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkck6\" (UniqueName: \"kubernetes.io/projected/ca6055d4-f54b-4ed5-b566-cb9be368fb20-kube-api-access-mkck6\") pod \"placement-db-create-xqvdp\" (UID: \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\") " pod="openstack/placement-db-create-xqvdp" Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.946848 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca6055d4-f54b-4ed5-b566-cb9be368fb20-operator-scripts\") pod \"placement-db-create-xqvdp\" (UID: \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\") " pod="openstack/placement-db-create-xqvdp" Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.946958 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7afea76a-bb71-4571-bb88-c221d4a5448d-operator-scripts\") pod \"placement-35ab-account-create-update-7jbxv\" (UID: \"7afea76a-bb71-4571-bb88-c221d4a5448d\") " pod="openstack/placement-35ab-account-create-update-7jbxv" Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.947043 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz6xr\" (UniqueName: \"kubernetes.io/projected/7afea76a-bb71-4571-bb88-c221d4a5448d-kube-api-access-tz6xr\") pod \"placement-35ab-account-create-update-7jbxv\" (UID: \"7afea76a-bb71-4571-bb88-c221d4a5448d\") " pod="openstack/placement-35ab-account-create-update-7jbxv" Dec 01 09:32:16 crc kubenswrapper[4763]: I1201 09:32:16.948564 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca6055d4-f54b-4ed5-b566-cb9be368fb20-operator-scripts\") pod \"placement-db-create-xqvdp\" (UID: \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\") " pod="openstack/placement-db-create-xqvdp" Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.003125 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkck6\" (UniqueName: \"kubernetes.io/projected/ca6055d4-f54b-4ed5-b566-cb9be368fb20-kube-api-access-mkck6\") pod \"placement-db-create-xqvdp\" (UID: \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\") " pod="openstack/placement-db-create-xqvdp" Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.051514 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6xr\" (UniqueName: \"kubernetes.io/projected/7afea76a-bb71-4571-bb88-c221d4a5448d-kube-api-access-tz6xr\") pod \"placement-35ab-account-create-update-7jbxv\" (UID: \"7afea76a-bb71-4571-bb88-c221d4a5448d\") " pod="openstack/placement-35ab-account-create-update-7jbxv" Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.051635 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7afea76a-bb71-4571-bb88-c221d4a5448d-operator-scripts\") pod \"placement-35ab-account-create-update-7jbxv\" (UID: \"7afea76a-bb71-4571-bb88-c221d4a5448d\") " pod="openstack/placement-35ab-account-create-update-7jbxv" Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.052230 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7afea76a-bb71-4571-bb88-c221d4a5448d-operator-scripts\") pod \"placement-35ab-account-create-update-7jbxv\" (UID: \"7afea76a-bb71-4571-bb88-c221d4a5448d\") " pod="openstack/placement-35ab-account-create-update-7jbxv" Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.109826 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xqvdp" Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.131018 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz6xr\" (UniqueName: \"kubernetes.io/projected/7afea76a-bb71-4571-bb88-c221d4a5448d-kube-api-access-tz6xr\") pod \"placement-35ab-account-create-update-7jbxv\" (UID: \"7afea76a-bb71-4571-bb88-c221d4a5448d\") " pod="openstack/placement-35ab-account-create-update-7jbxv" Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.170039 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35ab-account-create-update-7jbxv" Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.359359 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fd4n8"] Dec 01 09:32:17 crc kubenswrapper[4763]: W1201 09:32:17.372764 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69b412ca_2a25_4362_a6f1_67fbc091e68b.slice/crio-bf39719e2bc3cb94d19c3079f8773439c38e79d2b0689c3276357bd8b10ea97d WatchSource:0}: Error finding container bf39719e2bc3cb94d19c3079f8773439c38e79d2b0689c3276357bd8b10ea97d: Status 404 returned error can't find the container with id bf39719e2bc3cb94d19c3079f8773439c38e79d2b0689c3276357bd8b10ea97d Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.462266 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-glb7w"] Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.467425 4763 util.go:30] "No sandbox for pod can be found. 
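Every one of these pods gets a projected kube-api-access-* volume alongside its payload volumes; it bundles the service-account token, the cluster CA, and the namespace into a single mount. From inside any of the containers it appears at the standard service-account path (the path is the Kubernetes default and also appears verbatim in the container spec dumped at 09:32:37 below):

    from pathlib import Path

    # Standard in-pod location of the projected service-account volume.
    sa = Path("/var/run/secrets/kubernetes.io/serviceaccount")
    for name in ("token", "ca.crt", "namespace"):
        print(name, "present" if (sa / name).exists() else "missing")

glance-db-sync, added just above, additionally mounts three Secret-backed volumes (config-data, db-sync-config-data, combined-ca-bundle), which is why the reflector populates the glance-config-data secret cache below before its mounts proceed.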
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.472721 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xlzd8"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.472998 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.510364 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-glb7w"]
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.563094 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-combined-ca-bundle\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.563336 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd2dz\" (UniqueName: \"kubernetes.io/projected/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-kube-api-access-pd2dz\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.563369 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-db-sync-config-data\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.563950 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-config-data\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.668254 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-config-data\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.668700 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-combined-ca-bundle\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.668792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd2dz\" (UniqueName: \"kubernetes.io/projected/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-kube-api-access-pd2dz\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.668822 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-db-sync-config-data\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.680022 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-config-data\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.680184 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-combined-ca-bundle\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.687760 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd2dz\" (UniqueName: \"kubernetes.io/projected/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-kube-api-access-pd2dz\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.689807 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-db-sync-config-data\") pod \"glance-db-sync-glb7w\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.793393 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f80b-account-create-update-bmk58"]
Dec 01 09:32:17 crc kubenswrapper[4763]: W1201 09:32:17.794904 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb77b336_6911_4e22_888a_0a8e20435893.slice/crio-9a5270fb30f3474190bf4e8b21833b1a9536834c2041b862c4364397abb814c2 WatchSource:0}: Error finding container 9a5270fb30f3474190bf4e8b21833b1a9536834c2041b862c4364397abb814c2: Status 404 returned error can't find the container with id 9a5270fb30f3474190bf4e8b21833b1a9536834c2041b862c4364397abb814c2
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.800144 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-glb7w"
Dec 01 09:32:17 crc kubenswrapper[4763]: I1201 09:32:17.905996 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xqvdp"]
Dec 01 09:32:17 crc kubenswrapper[4763]: W1201 09:32:17.917527 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca6055d4_f54b_4ed5_b566_cb9be368fb20.slice/crio-b85ad0ec25020207e94bb75d26261731b810c37c23f883130c3f896af2550a72 WatchSource:0}: Error finding container b85ad0ec25020207e94bb75d26261731b810c37c23f883130c3f896af2550a72: Status 404 returned error can't find the container with id b85ad0ec25020207e94bb75d26261731b810c37c23f883130c3f896af2550a72
Dec 01 09:32:18 crc kubenswrapper[4763]: I1201 09:32:18.058639 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-35ab-account-create-update-7jbxv"]
Dec 01 09:32:18 crc kubenswrapper[4763]: W1201 09:32:18.071859 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7afea76a_bb71_4571_bb88_c221d4a5448d.slice/crio-6089fcc1a832cb234530d62dd7b3ffe0d2297672d19c80260ad663c9bdf58fb9 WatchSource:0}: Error finding container 6089fcc1a832cb234530d62dd7b3ffe0d2297672d19c80260ad663c9bdf58fb9: Status 404 returned error can't find the container with id 6089fcc1a832cb234530d62dd7b3ffe0d2297672d19c80260ad663c9bdf58fb9
Dec 01 09:32:18 crc kubenswrapper[4763]: I1201 09:32:18.164598 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xqvdp" event={"ID":"ca6055d4-f54b-4ed5-b566-cb9be368fb20","Type":"ContainerStarted","Data":"b85ad0ec25020207e94bb75d26261731b810c37c23f883130c3f896af2550a72"}
Dec 01 09:32:18 crc kubenswrapper[4763]: I1201 09:32:18.166234 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fd4n8" event={"ID":"69b412ca-2a25-4362-a6f1-67fbc091e68b","Type":"ContainerStarted","Data":"bf39719e2bc3cb94d19c3079f8773439c38e79d2b0689c3276357bd8b10ea97d"}
Dec 01 09:32:18 crc kubenswrapper[4763]: I1201 09:32:18.167243 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f80b-account-create-update-bmk58" event={"ID":"eb77b336-6911-4e22-888a-0a8e20435893","Type":"ContainerStarted","Data":"9a5270fb30f3474190bf4e8b21833b1a9536834c2041b862c4364397abb814c2"}
Dec 01 09:32:18 crc kubenswrapper[4763]: I1201 09:32:18.168078 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35ab-account-create-update-7jbxv" event={"ID":"7afea76a-bb71-4571-bb88-c221d4a5448d","Type":"ContainerStarted","Data":"6089fcc1a832cb234530d62dd7b3ffe0d2297672d19c80260ad663c9bdf58fb9"}
Dec 01 09:32:18 crc kubenswrapper[4763]: I1201 09:32:18.400684 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-glb7w"]
Dec 01 09:32:18 crc kubenswrapper[4763]: W1201 09:32:18.406833 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ffbeb86_fa1d_4382_b9eb_a51bea87c540.slice/crio-43c759c12ca071bce95929427b8a5f6d9b5b53428915da9edacfedc2ea96237a WatchSource:0}: Error finding container 43c759c12ca071bce95929427b8a5f6d9b5b53428915da9edacfedc2ea96237a: Status 404 returned error can't find the container with id 43c759c12ca071bce95929427b8a5f6d9b5b53428915da9edacfedc2ea96237a
Dec 01 09:32:19 crc kubenswrapper[4763]: I1201 09:32:19.182639 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-glb7w" event={"ID":"5ffbeb86-fa1d-4382-b9eb-a51bea87c540","Type":"ContainerStarted","Data":"43c759c12ca071bce95929427b8a5f6d9b5b53428915da9edacfedc2ea96237a"}
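The W1201 manager.go:1169 warnings with "Status 404" appear to be the common, benign cadvisor race on CRI-O nodes: the cgroup watch fires as the sandbox's cgroup is created, before the container is registered with the runtime's metadata store, so the lookup by id briefly 404s. The same 64-hex id then shows up moments later in a ContainerStarted PLEG event, which is easy to confirm by correlating ids across the log (the file path here is hypothetical):

    import re
    from collections import Counter

    # Count how often each container id appears; ids from the 404 warnings
    # recur in later PLEG events once the container is registered.
    text = open("kubelet.log").read()
    for cid, n in Counter(re.findall(r"\b[0-9a-f]{64}\b", text)).most_common(5):
        print(n, cid[:12])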
(PLEG): event for pod" pod="openstack/glance-db-sync-glb7w" event={"ID":"5ffbeb86-fa1d-4382-b9eb-a51bea87c540","Type":"ContainerStarted","Data":"43c759c12ca071bce95929427b8a5f6d9b5b53428915da9edacfedc2ea96237a"} Dec 01 09:32:21 crc kubenswrapper[4763]: I1201 09:32:21.196237 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35ab-account-create-update-7jbxv" event={"ID":"7afea76a-bb71-4571-bb88-c221d4a5448d","Type":"ContainerStarted","Data":"a562aea9491cc2cba051117385028e9a429d54c7b13fa276d7b918da2580ad4d"} Dec 01 09:32:21 crc kubenswrapper[4763]: I1201 09:32:21.199115 4763 generic.go:334] "Generic (PLEG): container finished" podID="ca6055d4-f54b-4ed5-b566-cb9be368fb20" containerID="fede3a5effb497c1ab7d6e74f5b866e2c8f3dc2a2652c587948508fa342c9ff5" exitCode=0 Dec 01 09:32:21 crc kubenswrapper[4763]: I1201 09:32:21.199175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xqvdp" event={"ID":"ca6055d4-f54b-4ed5-b566-cb9be368fb20","Type":"ContainerDied","Data":"fede3a5effb497c1ab7d6e74f5b866e2c8f3dc2a2652c587948508fa342c9ff5"} Dec 01 09:32:21 crc kubenswrapper[4763]: I1201 09:32:21.201067 4763 generic.go:334] "Generic (PLEG): container finished" podID="69b412ca-2a25-4362-a6f1-67fbc091e68b" containerID="26bc2ff8669dc59d6ea2499d5bc9ba97f352c002d3b43660fa6ef1c3ee31da86" exitCode=0 Dec 01 09:32:21 crc kubenswrapper[4763]: I1201 09:32:21.201124 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fd4n8" event={"ID":"69b412ca-2a25-4362-a6f1-67fbc091e68b","Type":"ContainerDied","Data":"26bc2ff8669dc59d6ea2499d5bc9ba97f352c002d3b43660fa6ef1c3ee31da86"} Dec 01 09:32:21 crc kubenswrapper[4763]: I1201 09:32:21.203916 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f80b-account-create-update-bmk58" event={"ID":"eb77b336-6911-4e22-888a-0a8e20435893","Type":"ContainerStarted","Data":"8d079fe5b541b44d6fb1f052c2687cb9a7becb442ce8bbc108e5efb3983f1ab7"} Dec 01 09:32:21 crc kubenswrapper[4763]: I1201 09:32:21.225750 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-35ab-account-create-update-7jbxv" podStartSLOduration=5.225731996 podStartE2EDuration="5.225731996s" podCreationTimestamp="2025-12-01 09:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:21.221510128 +0000 UTC m=+1058.490158896" watchObservedRunningTime="2025-12-01 09:32:21.225731996 +0000 UTC m=+1058.494380764" Dec 01 09:32:21 crc kubenswrapper[4763]: I1201 09:32:21.280502 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f80b-account-create-update-bmk58" podStartSLOduration=5.280483572 podStartE2EDuration="5.280483572s" podCreationTimestamp="2025-12-01 09:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:21.274176408 +0000 UTC m=+1058.542825176" watchObservedRunningTime="2025-12-01 09:32:21.280483572 +0000 UTC m=+1058.549132340" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.216301 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb77b336-6911-4e22-888a-0a8e20435893" containerID="8d079fe5b541b44d6fb1f052c2687cb9a7becb442ce8bbc108e5efb3983f1ab7" exitCode=0 Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.216571 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-f80b-account-create-update-bmk58" event={"ID":"eb77b336-6911-4e22-888a-0a8e20435893","Type":"ContainerDied","Data":"8d079fe5b541b44d6fb1f052c2687cb9a7becb442ce8bbc108e5efb3983f1ab7"} Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.222933 4763 generic.go:334] "Generic (PLEG): container finished" podID="7afea76a-bb71-4571-bb88-c221d4a5448d" containerID="a562aea9491cc2cba051117385028e9a429d54c7b13fa276d7b918da2580ad4d" exitCode=0 Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.223240 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35ab-account-create-update-7jbxv" event={"ID":"7afea76a-bb71-4571-bb88-c221d4a5448d","Type":"ContainerDied","Data":"a562aea9491cc2cba051117385028e9a429d54c7b13fa276d7b918da2580ad4d"} Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.773178 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fd4n8" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.780425 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xqvdp" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.797733 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkck6\" (UniqueName: \"kubernetes.io/projected/ca6055d4-f54b-4ed5-b566-cb9be368fb20-kube-api-access-mkck6\") pod \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\" (UID: \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\") " Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.797774 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69b412ca-2a25-4362-a6f1-67fbc091e68b-operator-scripts\") pod \"69b412ca-2a25-4362-a6f1-67fbc091e68b\" (UID: \"69b412ca-2a25-4362-a6f1-67fbc091e68b\") " Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.797812 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca6055d4-f54b-4ed5-b566-cb9be368fb20-operator-scripts\") pod \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\" (UID: \"ca6055d4-f54b-4ed5-b566-cb9be368fb20\") " Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.797854 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j282m\" (UniqueName: \"kubernetes.io/projected/69b412ca-2a25-4362-a6f1-67fbc091e68b-kube-api-access-j282m\") pod \"69b412ca-2a25-4362-a6f1-67fbc091e68b\" (UID: \"69b412ca-2a25-4362-a6f1-67fbc091e68b\") " Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.798940 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b412ca-2a25-4362-a6f1-67fbc091e68b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69b412ca-2a25-4362-a6f1-67fbc091e68b" (UID: "69b412ca-2a25-4362-a6f1-67fbc091e68b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.799615 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca6055d4-f54b-4ed5-b566-cb9be368fb20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca6055d4-f54b-4ed5-b566-cb9be368fb20" (UID: "ca6055d4-f54b-4ed5-b566-cb9be368fb20"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.808751 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69b412ca-2a25-4362-a6f1-67fbc091e68b-kube-api-access-j282m" (OuterVolumeSpecName: "kube-api-access-j282m") pod "69b412ca-2a25-4362-a6f1-67fbc091e68b" (UID: "69b412ca-2a25-4362-a6f1-67fbc091e68b"). InnerVolumeSpecName "kube-api-access-j282m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.809708 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6055d4-f54b-4ed5-b566-cb9be368fb20-kube-api-access-mkck6" (OuterVolumeSpecName: "kube-api-access-mkck6") pod "ca6055d4-f54b-4ed5-b566-cb9be368fb20" (UID: "ca6055d4-f54b-4ed5-b566-cb9be368fb20"). InnerVolumeSpecName "kube-api-access-mkck6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.899349 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkck6\" (UniqueName: \"kubernetes.io/projected/ca6055d4-f54b-4ed5-b566-cb9be368fb20-kube-api-access-mkck6\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.899381 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69b412ca-2a25-4362-a6f1-67fbc091e68b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.899390 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca6055d4-f54b-4ed5-b566-cb9be368fb20-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:22 crc kubenswrapper[4763]: I1201 09:32:22.899399 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j282m\" (UniqueName: \"kubernetes.io/projected/69b412ca-2a25-4362-a6f1-67fbc091e68b-kube-api-access-j282m\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.232887 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fd4n8" event={"ID":"69b412ca-2a25-4362-a6f1-67fbc091e68b","Type":"ContainerDied","Data":"bf39719e2bc3cb94d19c3079f8773439c38e79d2b0689c3276357bd8b10ea97d"} Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.232923 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fd4n8" Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.233005 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf39719e2bc3cb94d19c3079f8773439c38e79d2b0689c3276357bd8b10ea97d" Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.236078 4763 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.236566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xqvdp" event={"ID":"ca6055d4-f54b-4ed5-b566-cb9be368fb20","Type":"ContainerDied","Data":"b85ad0ec25020207e94bb75d26261731b810c37c23f883130c3f896af2550a72"}
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.236608 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b85ad0ec25020207e94bb75d26261731b810c37c23f883130c3f896af2550a72"
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.592921 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35ab-account-create-update-7jbxv"
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.599292 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f80b-account-create-update-bmk58"
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.620815 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz6xr\" (UniqueName: \"kubernetes.io/projected/7afea76a-bb71-4571-bb88-c221d4a5448d-kube-api-access-tz6xr\") pod \"7afea76a-bb71-4571-bb88-c221d4a5448d\" (UID: \"7afea76a-bb71-4571-bb88-c221d4a5448d\") "
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.620874 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7afea76a-bb71-4571-bb88-c221d4a5448d-operator-scripts\") pod \"7afea76a-bb71-4571-bb88-c221d4a5448d\" (UID: \"7afea76a-bb71-4571-bb88-c221d4a5448d\") "
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.623328 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7afea76a-bb71-4571-bb88-c221d4a5448d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7afea76a-bb71-4571-bb88-c221d4a5448d" (UID: "7afea76a-bb71-4571-bb88-c221d4a5448d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.641173 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afea76a-bb71-4571-bb88-c221d4a5448d-kube-api-access-tz6xr" (OuterVolumeSpecName: "kube-api-access-tz6xr") pod "7afea76a-bb71-4571-bb88-c221d4a5448d" (UID: "7afea76a-bb71-4571-bb88-c221d4a5448d"). InnerVolumeSpecName "kube-api-access-tz6xr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.722019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxtn7\" (UniqueName: \"kubernetes.io/projected/eb77b336-6911-4e22-888a-0a8e20435893-kube-api-access-bxtn7\") pod \"eb77b336-6911-4e22-888a-0a8e20435893\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") "
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.722117 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb77b336-6911-4e22-888a-0a8e20435893-operator-scripts\") pod \"eb77b336-6911-4e22-888a-0a8e20435893\" (UID: \"eb77b336-6911-4e22-888a-0a8e20435893\") "
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.722564 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz6xr\" (UniqueName: \"kubernetes.io/projected/7afea76a-bb71-4571-bb88-c221d4a5448d-kube-api-access-tz6xr\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.722584 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7afea76a-bb71-4571-bb88-c221d4a5448d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.722944 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb77b336-6911-4e22-888a-0a8e20435893-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb77b336-6911-4e22-888a-0a8e20435893" (UID: "eb77b336-6911-4e22-888a-0a8e20435893"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.725814 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb77b336-6911-4e22-888a-0a8e20435893-kube-api-access-bxtn7" (OuterVolumeSpecName: "kube-api-access-bxtn7") pod "eb77b336-6911-4e22-888a-0a8e20435893" (UID: "eb77b336-6911-4e22-888a-0a8e20435893"). InnerVolumeSpecName "kube-api-access-bxtn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.823813 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxtn7\" (UniqueName: \"kubernetes.io/projected/eb77b336-6911-4e22-888a-0a8e20435893-kube-api-access-bxtn7\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:23 crc kubenswrapper[4763]: I1201 09:32:23.823865 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb77b336-6911-4e22-888a-0a8e20435893-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 09:32:24 crc kubenswrapper[4763]: I1201 09:32:24.244482 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f80b-account-create-update-bmk58" event={"ID":"eb77b336-6911-4e22-888a-0a8e20435893","Type":"ContainerDied","Data":"9a5270fb30f3474190bf4e8b21833b1a9536834c2041b862c4364397abb814c2"}
Dec 01 09:32:24 crc kubenswrapper[4763]: I1201 09:32:24.244852 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a5270fb30f3474190bf4e8b21833b1a9536834c2041b862c4364397abb814c2"
Dec 01 09:32:24 crc kubenswrapper[4763]: I1201 09:32:24.244759 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f80b-account-create-update-bmk58"
Dec 01 09:32:24 crc kubenswrapper[4763]: I1201 09:32:24.262351 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35ab-account-create-update-7jbxv" event={"ID":"7afea76a-bb71-4571-bb88-c221d4a5448d","Type":"ContainerDied","Data":"6089fcc1a832cb234530d62dd7b3ffe0d2297672d19c80260ad663c9bdf58fb9"}
Dec 01 09:32:24 crc kubenswrapper[4763]: I1201 09:32:24.262417 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6089fcc1a832cb234530d62dd7b3ffe0d2297672d19c80260ad663c9bdf58fb9"
Dec 01 09:32:24 crc kubenswrapper[4763]: I1201 09:32:24.262497 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35ab-account-create-update-7jbxv"
Dec 01 09:32:24 crc kubenswrapper[4763]: I1201 09:32:24.736098 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Dec 01 09:32:27 crc kubenswrapper[4763]: I1201 09:32:27.286287 4763 generic.go:334] "Generic (PLEG): container finished" podID="56f133f4-8bf0-4c02-add2-37f41b8904cc" containerID="86983aba13148484a71a0d2d268e9d207c4ea276886647d390e4527e620f1a60" exitCode=0
Dec 01 09:32:27 crc kubenswrapper[4763]: I1201 09:32:27.286319 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56f133f4-8bf0-4c02-add2-37f41b8904cc","Type":"ContainerDied","Data":"86983aba13148484a71a0d2d268e9d207c4ea276886647d390e4527e620f1a60"}
Dec 01 09:32:27 crc kubenswrapper[4763]: I1201 09:32:27.289755 4763 generic.go:334] "Generic (PLEG): container finished" podID="53cf9c04-a52d-4827-a700-98ca02183344" containerID="8c1d98881c3bc1622990c364f450de88e45d211ceb7dc05c3517a65a63a82b89" exitCode=0
Dec 01 09:32:27 crc kubenswrapper[4763]: I1201 09:32:27.289903 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53cf9c04-a52d-4827-a700-98ca02183344","Type":"ContainerDied","Data":"8c1d98881c3bc1622990c364f450de88e45d211ceb7dc05c3517a65a63a82b89"}
Dec 01 09:32:32 crc kubenswrapper[4763]: I1201 09:32:32.473283 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-26n5d" podUID="68a1c130-7d5e-4679-9ec7-dd63b84cc8d5" containerName="ovn-controller" probeResult="failure" output=<
Dec 01 09:32:32 crc kubenswrapper[4763]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 01 09:32:32 crc kubenswrapper[4763]: >
Dec 01 09:32:37 crc kubenswrapper[4763]: E1201 09:32:37.286161 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified"
Dec 01 09:32:37 crc kubenswrapper[4763]: E1201 09:32:37.287003 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd2dz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-glb7w_openstack(5ffbeb86-fa1d-4382-b9eb-a51bea87c540): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 01 09:32:37 crc kubenswrapper[4763]: E1201 09:32:37.289113 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-glb7w" podUID="5ffbeb86-fa1d-4382-b9eb-a51bea87c540"
Dec 01 09:32:37 crc kubenswrapper[4763]: E1201 09:32:37.372877 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-glb7w" podUID="5ffbeb86-fa1d-4382-b9eb-a51bea87c540"
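Here the glance-db-sync image pull is canceled mid-copy ("copying config: context canceled"); the kubelet surfaces it first as ErrImagePull and then, on the retry path, as ImagePullBackOff with an increasing back-off. While the pod is stuck, its container status carries the reason and message, which can be inspected with the Python kubernetes client (pod and namespace names from the log; cluster access assumed):

    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    pod = v1.read_namespaced_pod("glance-db-sync-glb7w", "openstack")
    for cs in pod.status.container_statuses or []:
        w = cs.state.waiting
        if w:  # e.g. reason == "ImagePullBackOff" while the back-off lasts
            print(cs.name, w.reason, w.message)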
pod="openstack/ovn-controller-ovs-d2z4q" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.936336 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-26n5d-config-62ww7"] Dec 01 09:32:37 crc kubenswrapper[4763]: E1201 09:32:37.937378 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6055d4-f54b-4ed5-b566-cb9be368fb20" containerName="mariadb-database-create" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.937531 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6055d4-f54b-4ed5-b566-cb9be368fb20" containerName="mariadb-database-create" Dec 01 09:32:37 crc kubenswrapper[4763]: E1201 09:32:37.937630 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69b412ca-2a25-4362-a6f1-67fbc091e68b" containerName="mariadb-database-create" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.937707 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b412ca-2a25-4362-a6f1-67fbc091e68b" containerName="mariadb-database-create" Dec 01 09:32:37 crc kubenswrapper[4763]: E1201 09:32:37.937793 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb77b336-6911-4e22-888a-0a8e20435893" containerName="mariadb-account-create-update" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.937868 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb77b336-6911-4e22-888a-0a8e20435893" containerName="mariadb-account-create-update" Dec 01 09:32:37 crc kubenswrapper[4763]: E1201 09:32:37.937963 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afea76a-bb71-4571-bb88-c221d4a5448d" containerName="mariadb-account-create-update" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.938070 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afea76a-bb71-4571-bb88-c221d4a5448d" containerName="mariadb-account-create-update" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.938373 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb77b336-6911-4e22-888a-0a8e20435893" containerName="mariadb-account-create-update" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.938481 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afea76a-bb71-4571-bb88-c221d4a5448d" containerName="mariadb-account-create-update" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.938570 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6055d4-f54b-4ed5-b566-cb9be368fb20" containerName="mariadb-database-create" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.938634 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="69b412ca-2a25-4362-a6f1-67fbc091e68b" containerName="mariadb-database-create" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.939209 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.943771 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.953287 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26n5d-config-62ww7"] Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.999799 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-log-ovn\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.999851 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-scripts\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.999873 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9jkz\" (UniqueName: \"kubernetes.io/projected/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-kube-api-access-g9jkz\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:37 crc kubenswrapper[4763]: I1201 09:32:37.999906 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-additional-scripts\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.000017 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.000082 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run-ovn\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.101371 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run-ovn\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.101527 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-log-ovn\") 
pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.101548 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-scripts\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.101564 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9jkz\" (UniqueName: \"kubernetes.io/projected/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-kube-api-access-g9jkz\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.101587 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-additional-scripts\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.101606 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.101914 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.101967 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run-ovn\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.103036 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-additional-scripts\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.103534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-log-ovn\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.103988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-scripts\") pod \"ovn-controller-26n5d-config-62ww7\" 
(UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.122236 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9jkz\" (UniqueName: \"kubernetes.io/projected/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-kube-api-access-g9jkz\") pod \"ovn-controller-26n5d-config-62ww7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.256153 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.399952 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53cf9c04-a52d-4827-a700-98ca02183344","Type":"ContainerStarted","Data":"6421caf3cc59da633677f6a1a23a7dff3b63e33d2cf6cf34af4e51c833217b8c"} Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.401204 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.405970 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56f133f4-8bf0-4c02-add2-37f41b8904cc","Type":"ContainerStarted","Data":"87effba6752f4348dba8ec59ec9227f854967d19a27416c23ce446ec22ce32cc"} Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.443054 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=47.405418633 podStartE2EDuration="1m16.443029482s" podCreationTimestamp="2025-12-01 09:31:22 +0000 UTC" firstStartedPulling="2025-12-01 09:31:24.324346386 +0000 UTC m=+1001.592995154" lastFinishedPulling="2025-12-01 09:31:53.361957235 +0000 UTC m=+1030.630606003" observedRunningTime="2025-12-01 09:32:38.427123734 +0000 UTC m=+1075.695772502" watchObservedRunningTime="2025-12-01 09:32:38.443029482 +0000 UTC m=+1075.711678250" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.484022 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.671809108 podStartE2EDuration="1m17.484001129s" podCreationTimestamp="2025-12-01 09:31:21 +0000 UTC" firstStartedPulling="2025-12-01 09:31:24.014993687 +0000 UTC m=+1001.283642455" lastFinishedPulling="2025-12-01 09:31:52.827185708 +0000 UTC m=+1030.095834476" observedRunningTime="2025-12-01 09:32:38.471784373 +0000 UTC m=+1075.740433161" watchObservedRunningTime="2025-12-01 09:32:38.484001129 +0000 UTC m=+1075.752649897" Dec 01 09:32:38 crc kubenswrapper[4763]: I1201 09:32:38.793109 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-26n5d-config-62ww7"] Dec 01 09:32:38 crc kubenswrapper[4763]: W1201 09:32:38.797293 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b7be6f3_36d8_48a2_89d5_1a828fbc87f7.slice/crio-d37a49ab3e2fe37dcf9c4973a844206229215dc2775d6323d044dab857baf7db WatchSource:0}: Error finding container d37a49ab3e2fe37dcf9c4973a844206229215dc2775d6323d044dab857baf7db: Status 404 returned error can't find the container with id d37a49ab3e2fe37dcf9c4973a844206229215dc2775d6323d044dab857baf7db Dec 01 09:32:39 crc kubenswrapper[4763]: I1201 09:32:39.416359 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" containerID="fe5fd0e435360c1c54f9175a2bcb7e7217baa511ea5b558d42a75326d1a62c2e" exitCode=0 Dec 01 09:32:39 crc kubenswrapper[4763]: I1201 09:32:39.416440 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26n5d-config-62ww7" event={"ID":"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7","Type":"ContainerDied","Data":"fe5fd0e435360c1c54f9175a2bcb7e7217baa511ea5b558d42a75326d1a62c2e"} Dec 01 09:32:39 crc kubenswrapper[4763]: I1201 09:32:39.416865 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26n5d-config-62ww7" event={"ID":"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7","Type":"ContainerStarted","Data":"d37a49ab3e2fe37dcf9c4973a844206229215dc2775d6323d044dab857baf7db"} Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.712670 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.845072 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-additional-scripts\") pod \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.845668 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9jkz\" (UniqueName: \"kubernetes.io/projected/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-kube-api-access-g9jkz\") pod \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.845701 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run\") pod \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.845736 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-log-ovn\") pod \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.845769 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-scripts\") pod \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.845819 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run-ovn\") pod \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\" (UID: \"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7\") " Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.845851 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" (UID: "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.845909 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run" (OuterVolumeSpecName: "var-run") pod "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" (UID: "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.846221 4763 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.846244 4763 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.846295 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" (UID: "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.846329 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" (UID: "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.847322 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-scripts" (OuterVolumeSpecName: "scripts") pod "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" (UID: "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.850729 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-kube-api-access-g9jkz" (OuterVolumeSpecName: "kube-api-access-g9jkz") pod "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" (UID: "8b7be6f3-36d8-48a2-89d5-1a828fbc87f7"). InnerVolumeSpecName "kube-api-access-g9jkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.947361 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9jkz\" (UniqueName: \"kubernetes.io/projected/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-kube-api-access-g9jkz\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.947397 4763 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.947411 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:40 crc kubenswrapper[4763]: I1201 09:32:40.947423 4763 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 01 09:32:41 crc kubenswrapper[4763]: I1201 09:32:41.435322 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-26n5d-config-62ww7" event={"ID":"8b7be6f3-36d8-48a2-89d5-1a828fbc87f7","Type":"ContainerDied","Data":"d37a49ab3e2fe37dcf9c4973a844206229215dc2775d6323d044dab857baf7db"} Dec 01 09:32:41 crc kubenswrapper[4763]: I1201 09:32:41.435409 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d37a49ab3e2fe37dcf9c4973a844206229215dc2775d6323d044dab857baf7db" Dec 01 09:32:41 crc kubenswrapper[4763]: I1201 09:32:41.435493 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-26n5d-config-62ww7" Dec 01 09:32:41 crc kubenswrapper[4763]: I1201 09:32:41.811860 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-26n5d-config-62ww7"] Dec 01 09:32:41 crc kubenswrapper[4763]: I1201 09:32:41.819428 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-26n5d-config-62ww7"] Dec 01 09:32:42 crc kubenswrapper[4763]: I1201 09:32:42.487339 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-26n5d" Dec 01 09:32:43 crc kubenswrapper[4763]: I1201 09:32:43.004692 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" path="/var/lib/kubelet/pods/8b7be6f3-36d8-48a2-89d5-1a828fbc87f7/volumes" Dec 01 09:32:43 crc kubenswrapper[4763]: I1201 09:32:43.258600 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.255778 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.541931 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-glb7w" event={"ID":"5ffbeb86-fa1d-4382-b9eb-a51bea87c540","Type":"ContainerStarted","Data":"23c13ea7c64fcd259467f9887f5be246b537ba8010d5e7e0911f6f48251109d7"} Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.565844 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-42dt5"] Dec 01 09:32:53 crc kubenswrapper[4763]: E1201 09:32:53.566393 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" containerName="ovn-config" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.566488 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" containerName="ovn-config" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.566692 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7be6f3-36d8-48a2-89d5-1a828fbc87f7" containerName="ovn-config" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.575912 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-42dt5" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.577683 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-glb7w" podStartSLOduration=2.819004138 podStartE2EDuration="36.577667332s" podCreationTimestamp="2025-12-01 09:32:17 +0000 UTC" firstStartedPulling="2025-12-01 09:32:18.408188786 +0000 UTC m=+1055.676837554" lastFinishedPulling="2025-12-01 09:32:52.16685198 +0000 UTC m=+1089.435500748" observedRunningTime="2025-12-01 09:32:53.576988116 +0000 UTC m=+1090.845636884" watchObservedRunningTime="2025-12-01 09:32:53.577667332 +0000 UTC m=+1090.846316100" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.603864 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-42dt5"] Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.648301 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-operator-scripts\") pod \"barbican-db-create-42dt5\" (UID: \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\") " pod="openstack/barbican-db-create-42dt5" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.648427 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9dw\" (UniqueName: \"kubernetes.io/projected/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-kube-api-access-8v9dw\") pod \"barbican-db-create-42dt5\" (UID: \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\") " pod="openstack/barbican-db-create-42dt5" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.706779 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.750951 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-operator-scripts\") pod \"barbican-db-create-42dt5\" (UID: \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\") " pod="openstack/barbican-db-create-42dt5" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.751056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9dw\" (UniqueName: \"kubernetes.io/projected/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-kube-api-access-8v9dw\") pod \"barbican-db-create-42dt5\" (UID: \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\") " pod="openstack/barbican-db-create-42dt5" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.752172 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-operator-scripts\") pod \"barbican-db-create-42dt5\" (UID: \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\") " 
pod="openstack/barbican-db-create-42dt5" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.812255 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-abc0-account-create-update-74tjb"] Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.814551 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.824208 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9dw\" (UniqueName: \"kubernetes.io/projected/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-kube-api-access-8v9dw\") pod \"barbican-db-create-42dt5\" (UID: \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\") " pod="openstack/barbican-db-create-42dt5" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.838563 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.844734 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-abc0-account-create-update-74tjb"] Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.878031 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-wk8xf"] Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.879347 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wk8xf" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.888662 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wk8xf"] Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.904632 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-42dt5" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.954239 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef3f761-a29f-493f-8ace-68422774cc27-operator-scripts\") pod \"barbican-abc0-account-create-update-74tjb\" (UID: \"6ef3f761-a29f-493f-8ace-68422774cc27\") " pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.954374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjjh\" (UniqueName: \"kubernetes.io/projected/b805f491-7378-4eae-8c43-af5315818571-kube-api-access-9mjjh\") pod \"cinder-db-create-wk8xf\" (UID: \"b805f491-7378-4eae-8c43-af5315818571\") " pod="openstack/cinder-db-create-wk8xf" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.954437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68v98\" (UniqueName: \"kubernetes.io/projected/6ef3f761-a29f-493f-8ace-68422774cc27-kube-api-access-68v98\") pod \"barbican-abc0-account-create-update-74tjb\" (UID: \"6ef3f761-a29f-493f-8ace-68422774cc27\") " pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.954484 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b805f491-7378-4eae-8c43-af5315818571-operator-scripts\") pod \"cinder-db-create-wk8xf\" (UID: \"b805f491-7378-4eae-8c43-af5315818571\") " pod="openstack/cinder-db-create-wk8xf" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 
09:32:53.989766 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-drwnp"] Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.999037 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-drwnp" Dec 01 09:32:53 crc kubenswrapper[4763]: I1201 09:32:53.999143 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-drwnp"] Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.056637 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68v98\" (UniqueName: \"kubernetes.io/projected/6ef3f761-a29f-493f-8ace-68422774cc27-kube-api-access-68v98\") pod \"barbican-abc0-account-create-update-74tjb\" (UID: \"6ef3f761-a29f-493f-8ace-68422774cc27\") " pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.056682 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b805f491-7378-4eae-8c43-af5315818571-operator-scripts\") pod \"cinder-db-create-wk8xf\" (UID: \"b805f491-7378-4eae-8c43-af5315818571\") " pod="openstack/cinder-db-create-wk8xf" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.056731 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef3f761-a29f-493f-8ace-68422774cc27-operator-scripts\") pod \"barbican-abc0-account-create-update-74tjb\" (UID: \"6ef3f761-a29f-493f-8ace-68422774cc27\") " pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.056778 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hp6t\" (UniqueName: \"kubernetes.io/projected/85027a6d-3549-4e56-80b6-b8133b118eae-kube-api-access-4hp6t\") pod \"neutron-db-create-drwnp\" (UID: \"85027a6d-3549-4e56-80b6-b8133b118eae\") " pod="openstack/neutron-db-create-drwnp" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.056800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85027a6d-3549-4e56-80b6-b8133b118eae-operator-scripts\") pod \"neutron-db-create-drwnp\" (UID: \"85027a6d-3549-4e56-80b6-b8133b118eae\") " pod="openstack/neutron-db-create-drwnp" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.056888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjjh\" (UniqueName: \"kubernetes.io/projected/b805f491-7378-4eae-8c43-af5315818571-kube-api-access-9mjjh\") pod \"cinder-db-create-wk8xf\" (UID: \"b805f491-7378-4eae-8c43-af5315818571\") " pod="openstack/cinder-db-create-wk8xf" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.057866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b805f491-7378-4eae-8c43-af5315818571-operator-scripts\") pod \"cinder-db-create-wk8xf\" (UID: \"b805f491-7378-4eae-8c43-af5315818571\") " pod="openstack/cinder-db-create-wk8xf" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.058307 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef3f761-a29f-493f-8ace-68422774cc27-operator-scripts\") pod 
\"barbican-abc0-account-create-update-74tjb\" (UID: \"6ef3f761-a29f-493f-8ace-68422774cc27\") " pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.082705 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-485b-account-create-update-6l8rt"] Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.089320 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.101121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjjh\" (UniqueName: \"kubernetes.io/projected/b805f491-7378-4eae-8c43-af5315818571-kube-api-access-9mjjh\") pod \"cinder-db-create-wk8xf\" (UID: \"b805f491-7378-4eae-8c43-af5315818571\") " pod="openstack/cinder-db-create-wk8xf" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.102892 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-485b-account-create-update-6l8rt"] Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.112202 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.121072 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68v98\" (UniqueName: \"kubernetes.io/projected/6ef3f761-a29f-493f-8ace-68422774cc27-kube-api-access-68v98\") pod \"barbican-abc0-account-create-update-74tjb\" (UID: \"6ef3f761-a29f-493f-8ace-68422774cc27\") " pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.158141 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-operator-scripts\") pod \"cinder-485b-account-create-update-6l8rt\" (UID: \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\") " pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.158230 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hp6t\" (UniqueName: \"kubernetes.io/projected/85027a6d-3549-4e56-80b6-b8133b118eae-kube-api-access-4hp6t\") pod \"neutron-db-create-drwnp\" (UID: \"85027a6d-3549-4e56-80b6-b8133b118eae\") " pod="openstack/neutron-db-create-drwnp" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.158268 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85027a6d-3549-4e56-80b6-b8133b118eae-operator-scripts\") pod \"neutron-db-create-drwnp\" (UID: \"85027a6d-3549-4e56-80b6-b8133b118eae\") " pod="openstack/neutron-db-create-drwnp" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.158296 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl8zd\" (UniqueName: \"kubernetes.io/projected/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-kube-api-access-dl8zd\") pod \"cinder-485b-account-create-update-6l8rt\" (UID: \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\") " pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.159579 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/85027a6d-3549-4e56-80b6-b8133b118eae-operator-scripts\") pod \"neutron-db-create-drwnp\" (UID: \"85027a6d-3549-4e56-80b6-b8133b118eae\") " pod="openstack/neutron-db-create-drwnp" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.171942 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.182194 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hp6t\" (UniqueName: \"kubernetes.io/projected/85027a6d-3549-4e56-80b6-b8133b118eae-kube-api-access-4hp6t\") pod \"neutron-db-create-drwnp\" (UID: \"85027a6d-3549-4e56-80b6-b8133b118eae\") " pod="openstack/neutron-db-create-drwnp" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.214499 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wk8xf" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.234510 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4e18-account-create-update-b6f8x"] Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.235511 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.244524 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4e18-account-create-update-b6f8x"] Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.251314 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.259939 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-operator-scripts\") pod \"cinder-485b-account-create-update-6l8rt\" (UID: \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\") " pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.260079 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl8zd\" (UniqueName: \"kubernetes.io/projected/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-kube-api-access-dl8zd\") pod \"cinder-485b-account-create-update-6l8rt\" (UID: \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\") " pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.260253 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-operator-scripts\") pod \"cinder-485b-account-create-update-6l8rt\" (UID: \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\") " pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.287370 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl8zd\" (UniqueName: \"kubernetes.io/projected/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-kube-api-access-dl8zd\") pod \"cinder-485b-account-create-update-6l8rt\" (UID: \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\") " pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.324006 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-drwnp" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.361145 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-m2rxm"] Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.361590 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqpz8\" (UniqueName: \"kubernetes.io/projected/05241c5a-6373-4ec7-a9fc-737e27036bb0-kube-api-access-rqpz8\") pod \"neutron-4e18-account-create-update-b6f8x\" (UID: \"05241c5a-6373-4ec7-a9fc-737e27036bb0\") " pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.361729 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05241c5a-6373-4ec7-a9fc-737e27036bb0-operator-scripts\") pod \"neutron-4e18-account-create-update-b6f8x\" (UID: \"05241c5a-6373-4ec7-a9fc-737e27036bb0\") " pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.368359 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.374329 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.374571 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.374686 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w8tqv" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.374788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.395130 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-m2rxm"] Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.453112 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-42dt5"] Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.463264 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfphz\" (UniqueName: \"kubernetes.io/projected/b221a534-cb94-42b4-af86-da3c66d827d5-kube-api-access-zfphz\") pod \"keystone-db-sync-m2rxm\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.463305 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-config-data\") pod \"keystone-db-sync-m2rxm\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.463339 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqpz8\" (UniqueName: \"kubernetes.io/projected/05241c5a-6373-4ec7-a9fc-737e27036bb0-kube-api-access-rqpz8\") pod \"neutron-4e18-account-create-update-b6f8x\" (UID: \"05241c5a-6373-4ec7-a9fc-737e27036bb0\") " pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.463396 4763 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-combined-ca-bundle\") pod \"keystone-db-sync-m2rxm\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.463433 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05241c5a-6373-4ec7-a9fc-737e27036bb0-operator-scripts\") pod \"neutron-4e18-account-create-update-b6f8x\" (UID: \"05241c5a-6373-4ec7-a9fc-737e27036bb0\") " pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.464183 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05241c5a-6373-4ec7-a9fc-737e27036bb0-operator-scripts\") pod \"neutron-4e18-account-create-update-b6f8x\" (UID: \"05241c5a-6373-4ec7-a9fc-737e27036bb0\") " pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.493044 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.570031 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfphz\" (UniqueName: \"kubernetes.io/projected/b221a534-cb94-42b4-af86-da3c66d827d5-kube-api-access-zfphz\") pod \"keystone-db-sync-m2rxm\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.570709 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-config-data\") pod \"keystone-db-sync-m2rxm\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.570845 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-combined-ca-bundle\") pod \"keystone-db-sync-m2rxm\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.654100 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfphz\" (UniqueName: \"kubernetes.io/projected/b221a534-cb94-42b4-af86-da3c66d827d5-kube-api-access-zfphz\") pod \"keystone-db-sync-m2rxm\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.655189 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-config-data\") pod \"keystone-db-sync-m2rxm\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.656510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-combined-ca-bundle\") pod \"keystone-db-sync-m2rxm\" (UID: 
\"b221a534-cb94-42b4-af86-da3c66d827d5\") " pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.656786 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqpz8\" (UniqueName: \"kubernetes.io/projected/05241c5a-6373-4ec7-a9fc-737e27036bb0-kube-api-access-rqpz8\") pod \"neutron-4e18-account-create-update-b6f8x\" (UID: \"05241c5a-6373-4ec7-a9fc-737e27036bb0\") " pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.717795 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.867807 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:32:54 crc kubenswrapper[4763]: I1201 09:32:54.891143 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-abc0-account-create-update-74tjb"] Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.014401 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wk8xf"] Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.204915 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-m2rxm"] Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.283998 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-485b-account-create-update-6l8rt"] Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.306861 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-drwnp"] Dec 01 09:32:55 crc kubenswrapper[4763]: W1201 09:32:55.323155 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85027a6d_3549_4e56_80b6_b8133b118eae.slice/crio-ab9cad0481027383fcd782e4a73a9fc6583117e49f98d6e2592dab5d66a67e1b WatchSource:0}: Error finding container ab9cad0481027383fcd782e4a73a9fc6583117e49f98d6e2592dab5d66a67e1b: Status 404 returned error can't find the container with id ab9cad0481027383fcd782e4a73a9fc6583117e49f98d6e2592dab5d66a67e1b Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.583271 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-42dt5" event={"ID":"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740","Type":"ContainerStarted","Data":"337aad30eae8e92710957f5d8af1dced83f62a1d549f8559a77714f4cf87bef9"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.583604 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-42dt5" event={"ID":"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740","Type":"ContainerStarted","Data":"233be920b507b2154a65da8767d4aa7dbfd14cacf1f53c70ebec6e5037ccc748"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.594666 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-abc0-account-create-update-74tjb" event={"ID":"6ef3f761-a29f-493f-8ace-68422774cc27","Type":"ContainerStarted","Data":"c62d1f1acd610597fd9d3c1da14c73ed1475bfe8907cab68703f4782777d78e5"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.594706 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-abc0-account-create-update-74tjb" event={"ID":"6ef3f761-a29f-493f-8ace-68422774cc27","Type":"ContainerStarted","Data":"de7813519dd1700b13269024b99e303420b54ee19e8ff68328de102f5f052131"} Dec 01 09:32:55 crc 
kubenswrapper[4763]: I1201 09:32:55.600506 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m2rxm" event={"ID":"b221a534-cb94-42b4-af86-da3c66d827d5","Type":"ContainerStarted","Data":"0b0411130401f14c526124b5f9b42f929b38021c589d4bdcd191eb603a269c1a"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.605771 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-485b-account-create-update-6l8rt" event={"ID":"dc8d0192-f2bf-4dfb-b7b7-fba83523017e","Type":"ContainerStarted","Data":"e37beafcdc97dcbfd0a7626e9a9bc5a77f0abe7b6dcc633b88918edcd967be40"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.605815 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-485b-account-create-update-6l8rt" event={"ID":"dc8d0192-f2bf-4dfb-b7b7-fba83523017e","Type":"ContainerStarted","Data":"6ec98b2c19661f5ec505a384a051e3003a0dc0b0cddefdb028e45477d6eeb9bd"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.605848 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-42dt5" podStartSLOduration=2.605828173 podStartE2EDuration="2.605828173s" podCreationTimestamp="2025-12-01 09:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:55.598882213 +0000 UTC m=+1092.867530981" watchObservedRunningTime="2025-12-01 09:32:55.605828173 +0000 UTC m=+1092.874476941" Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.614435 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-drwnp" event={"ID":"85027a6d-3549-4e56-80b6-b8133b118eae","Type":"ContainerStarted","Data":"ab9cad0481027383fcd782e4a73a9fc6583117e49f98d6e2592dab5d66a67e1b"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.620614 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wk8xf" event={"ID":"b805f491-7378-4eae-8c43-af5315818571","Type":"ContainerStarted","Data":"059680f31240098fe413b8617fe9d99490758b5f2086fc49ac1d036b45555084"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.620682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wk8xf" event={"ID":"b805f491-7378-4eae-8c43-af5315818571","Type":"ContainerStarted","Data":"32937dd60a97e2633722c75ee71a0995d2cc0a3d9cc58ce97854ea055b2e8ee7"} Dec 01 09:32:55 crc kubenswrapper[4763]: I1201 09:32:55.664830 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4e18-account-create-update-b6f8x"] Dec 01 09:32:56 crc kubenswrapper[4763]: I1201 09:32:56.645264 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e18-account-create-update-b6f8x" event={"ID":"05241c5a-6373-4ec7-a9fc-737e27036bb0","Type":"ContainerStarted","Data":"ae71100dd4a342ac7b18e44eb49d8dff2017203a3eac025a57e43236d1d7787d"} Dec 01 09:32:56 crc kubenswrapper[4763]: I1201 09:32:56.645544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e18-account-create-update-b6f8x" event={"ID":"05241c5a-6373-4ec7-a9fc-737e27036bb0","Type":"ContainerStarted","Data":"d5b86692bcda0a1645b131315986ffbd1e7a159e09482df09f960958ae6a2006"} Dec 01 09:32:56 crc kubenswrapper[4763]: I1201 09:32:56.650111 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-drwnp" 
event={"ID":"85027a6d-3549-4e56-80b6-b8133b118eae","Type":"ContainerStarted","Data":"426f6c4e94728dd50e1965a4abfbc0e39468a9a7931ff72315cfc98f7bfb584f"} Dec 01 09:32:56 crc kubenswrapper[4763]: I1201 09:32:56.671150 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-4e18-account-create-update-b6f8x" podStartSLOduration=2.671129966 podStartE2EDuration="2.671129966s" podCreationTimestamp="2025-12-01 09:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:56.660912637 +0000 UTC m=+1093.929561405" watchObservedRunningTime="2025-12-01 09:32:56.671129966 +0000 UTC m=+1093.939778734" Dec 01 09:32:56 crc kubenswrapper[4763]: I1201 09:32:56.686358 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-485b-account-create-update-6l8rt" podStartSLOduration=2.686341368 podStartE2EDuration="2.686341368s" podCreationTimestamp="2025-12-01 09:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:56.679914583 +0000 UTC m=+1093.948563341" watchObservedRunningTime="2025-12-01 09:32:56.686341368 +0000 UTC m=+1093.954990136" Dec 01 09:32:56 crc kubenswrapper[4763]: I1201 09:32:56.740523 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-wk8xf" podStartSLOduration=3.74050242 podStartE2EDuration="3.74050242s" podCreationTimestamp="2025-12-01 09:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:56.739965796 +0000 UTC m=+1094.008614564" watchObservedRunningTime="2025-12-01 09:32:56.74050242 +0000 UTC m=+1094.009151198" Dec 01 09:32:56 crc kubenswrapper[4763]: I1201 09:32:56.741131 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-drwnp" podStartSLOduration=3.741124086 podStartE2EDuration="3.741124086s" podCreationTimestamp="2025-12-01 09:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:56.71175848 +0000 UTC m=+1093.980407248" watchObservedRunningTime="2025-12-01 09:32:56.741124086 +0000 UTC m=+1094.009772854" Dec 01 09:32:56 crc kubenswrapper[4763]: I1201 09:32:56.762860 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-abc0-account-create-update-74tjb" podStartSLOduration=3.762840985 podStartE2EDuration="3.762840985s" podCreationTimestamp="2025-12-01 09:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:32:56.757499978 +0000 UTC m=+1094.026148746" watchObservedRunningTime="2025-12-01 09:32:56.762840985 +0000 UTC m=+1094.031489753" Dec 01 09:32:57 crc kubenswrapper[4763]: I1201 09:32:57.684317 4763 generic.go:334] "Generic (PLEG): container finished" podID="b805f491-7378-4eae-8c43-af5315818571" containerID="059680f31240098fe413b8617fe9d99490758b5f2086fc49ac1d036b45555084" exitCode=0 Dec 01 09:32:57 crc kubenswrapper[4763]: I1201 09:32:57.684854 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wk8xf" 
event={"ID":"b805f491-7378-4eae-8c43-af5315818571","Type":"ContainerDied","Data":"059680f31240098fe413b8617fe9d99490758b5f2086fc49ac1d036b45555084"} Dec 01 09:32:58 crc kubenswrapper[4763]: E1201 09:32:58.219980 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a0b7a2_5ddf_4b28_8d66_73c0b126b740.slice/crio-337aad30eae8e92710957f5d8af1dced83f62a1d549f8559a77714f4cf87bef9.scope\": RecentStats: unable to find data in memory cache]" Dec 01 09:32:58 crc kubenswrapper[4763]: I1201 09:32:58.697439 4763 generic.go:334] "Generic (PLEG): container finished" podID="85027a6d-3549-4e56-80b6-b8133b118eae" containerID="426f6c4e94728dd50e1965a4abfbc0e39468a9a7931ff72315cfc98f7bfb584f" exitCode=0 Dec 01 09:32:58 crc kubenswrapper[4763]: I1201 09:32:58.698510 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-drwnp" event={"ID":"85027a6d-3549-4e56-80b6-b8133b118eae","Type":"ContainerDied","Data":"426f6c4e94728dd50e1965a4abfbc0e39468a9a7931ff72315cfc98f7bfb584f"} Dec 01 09:32:58 crc kubenswrapper[4763]: I1201 09:32:58.701077 4763 generic.go:334] "Generic (PLEG): container finished" podID="d0a0b7a2-5ddf-4b28-8d66-73c0b126b740" containerID="337aad30eae8e92710957f5d8af1dced83f62a1d549f8559a77714f4cf87bef9" exitCode=0 Dec 01 09:32:58 crc kubenswrapper[4763]: I1201 09:32:58.701287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-42dt5" event={"ID":"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740","Type":"ContainerDied","Data":"337aad30eae8e92710957f5d8af1dced83f62a1d549f8559a77714f4cf87bef9"} Dec 01 09:33:00 crc kubenswrapper[4763]: I1201 09:33:00.718246 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ef3f761-a29f-493f-8ace-68422774cc27" containerID="c62d1f1acd610597fd9d3c1da14c73ed1475bfe8907cab68703f4782777d78e5" exitCode=0 Dec 01 09:33:00 crc kubenswrapper[4763]: I1201 09:33:00.718847 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-abc0-account-create-update-74tjb" event={"ID":"6ef3f761-a29f-493f-8ace-68422774cc27","Type":"ContainerDied","Data":"c62d1f1acd610597fd9d3c1da14c73ed1475bfe8907cab68703f4782777d78e5"} Dec 01 09:33:00 crc kubenswrapper[4763]: I1201 09:33:00.721049 4763 generic.go:334] "Generic (PLEG): container finished" podID="dc8d0192-f2bf-4dfb-b7b7-fba83523017e" containerID="e37beafcdc97dcbfd0a7626e9a9bc5a77f0abe7b6dcc633b88918edcd967be40" exitCode=0 Dec 01 09:33:00 crc kubenswrapper[4763]: I1201 09:33:00.721219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-485b-account-create-update-6l8rt" event={"ID":"dc8d0192-f2bf-4dfb-b7b7-fba83523017e","Type":"ContainerDied","Data":"e37beafcdc97dcbfd0a7626e9a9bc5a77f0abe7b6dcc633b88918edcd967be40"} Dec 01 09:33:01 crc kubenswrapper[4763]: I1201 09:33:01.734243 4763 generic.go:334] "Generic (PLEG): container finished" podID="05241c5a-6373-4ec7-a9fc-737e27036bb0" containerID="ae71100dd4a342ac7b18e44eb49d8dff2017203a3eac025a57e43236d1d7787d" exitCode=0 Dec 01 09:33:01 crc kubenswrapper[4763]: I1201 09:33:01.734356 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e18-account-create-update-b6f8x" event={"ID":"05241c5a-6373-4ec7-a9fc-737e27036bb0","Type":"ContainerDied","Data":"ae71100dd4a342ac7b18e44eb49d8dff2017203a3eac025a57e43236d1d7787d"} Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.234217 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-drwnp" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.243149 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wk8xf" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.253960 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-42dt5" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.285173 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.291990 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.367363 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mjjh\" (UniqueName: \"kubernetes.io/projected/b805f491-7378-4eae-8c43-af5315818571-kube-api-access-9mjjh\") pod \"b805f491-7378-4eae-8c43-af5315818571\" (UID: \"b805f491-7378-4eae-8c43-af5315818571\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.367406 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85027a6d-3549-4e56-80b6-b8133b118eae-operator-scripts\") pod \"85027a6d-3549-4e56-80b6-b8133b118eae\" (UID: \"85027a6d-3549-4e56-80b6-b8133b118eae\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.367461 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v9dw\" (UniqueName: \"kubernetes.io/projected/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-kube-api-access-8v9dw\") pod \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\" (UID: \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.367633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hp6t\" (UniqueName: \"kubernetes.io/projected/85027a6d-3549-4e56-80b6-b8133b118eae-kube-api-access-4hp6t\") pod \"85027a6d-3549-4e56-80b6-b8133b118eae\" (UID: \"85027a6d-3549-4e56-80b6-b8133b118eae\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.367673 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b805f491-7378-4eae-8c43-af5315818571-operator-scripts\") pod \"b805f491-7378-4eae-8c43-af5315818571\" (UID: \"b805f491-7378-4eae-8c43-af5315818571\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.367713 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-operator-scripts\") pod \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\" (UID: \"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.368377 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b805f491-7378-4eae-8c43-af5315818571-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b805f491-7378-4eae-8c43-af5315818571" (UID: "b805f491-7378-4eae-8c43-af5315818571"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.368429 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0a0b7a2-5ddf-4b28-8d66-73c0b126b740" (UID: "d0a0b7a2-5ddf-4b28-8d66-73c0b126b740"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.368418 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85027a6d-3549-4e56-80b6-b8133b118eae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85027a6d-3549-4e56-80b6-b8133b118eae" (UID: "85027a6d-3549-4e56-80b6-b8133b118eae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.373302 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-kube-api-access-8v9dw" (OuterVolumeSpecName: "kube-api-access-8v9dw") pod "d0a0b7a2-5ddf-4b28-8d66-73c0b126b740" (UID: "d0a0b7a2-5ddf-4b28-8d66-73c0b126b740"). InnerVolumeSpecName "kube-api-access-8v9dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.379168 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85027a6d-3549-4e56-80b6-b8133b118eae-kube-api-access-4hp6t" (OuterVolumeSpecName: "kube-api-access-4hp6t") pod "85027a6d-3549-4e56-80b6-b8133b118eae" (UID: "85027a6d-3549-4e56-80b6-b8133b118eae"). InnerVolumeSpecName "kube-api-access-4hp6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.379262 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b805f491-7378-4eae-8c43-af5315818571-kube-api-access-9mjjh" (OuterVolumeSpecName: "kube-api-access-9mjjh") pod "b805f491-7378-4eae-8c43-af5315818571" (UID: "b805f491-7378-4eae-8c43-af5315818571"). InnerVolumeSpecName "kube-api-access-9mjjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.469020 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef3f761-a29f-493f-8ace-68422774cc27-operator-scripts\") pod \"6ef3f761-a29f-493f-8ace-68422774cc27\" (UID: \"6ef3f761-a29f-493f-8ace-68422774cc27\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.469354 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68v98\" (UniqueName: \"kubernetes.io/projected/6ef3f761-a29f-493f-8ace-68422774cc27-kube-api-access-68v98\") pod \"6ef3f761-a29f-493f-8ace-68422774cc27\" (UID: \"6ef3f761-a29f-493f-8ace-68422774cc27\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.469376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl8zd\" (UniqueName: \"kubernetes.io/projected/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-kube-api-access-dl8zd\") pod \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\" (UID: \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.469384 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ef3f761-a29f-493f-8ace-68422774cc27-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ef3f761-a29f-493f-8ace-68422774cc27" (UID: "6ef3f761-a29f-493f-8ace-68422774cc27"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.469476 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-operator-scripts\") pod \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\" (UID: \"dc8d0192-f2bf-4dfb-b7b7-fba83523017e\") " Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.470231 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v9dw\" (UniqueName: \"kubernetes.io/projected/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-kube-api-access-8v9dw\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.470253 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ef3f761-a29f-493f-8ace-68422774cc27-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.470262 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hp6t\" (UniqueName: \"kubernetes.io/projected/85027a6d-3549-4e56-80b6-b8133b118eae-kube-api-access-4hp6t\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.470272 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b805f491-7378-4eae-8c43-af5315818571-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.470282 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.470290 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mjjh\" (UniqueName: 
\"kubernetes.io/projected/b805f491-7378-4eae-8c43-af5315818571-kube-api-access-9mjjh\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.470298 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85027a6d-3549-4e56-80b6-b8133b118eae-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.470703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc8d0192-f2bf-4dfb-b7b7-fba83523017e" (UID: "dc8d0192-f2bf-4dfb-b7b7-fba83523017e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.474306 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-kube-api-access-dl8zd" (OuterVolumeSpecName: "kube-api-access-dl8zd") pod "dc8d0192-f2bf-4dfb-b7b7-fba83523017e" (UID: "dc8d0192-f2bf-4dfb-b7b7-fba83523017e"). InnerVolumeSpecName "kube-api-access-dl8zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.474597 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef3f761-a29f-493f-8ace-68422774cc27-kube-api-access-68v98" (OuterVolumeSpecName: "kube-api-access-68v98") pod "6ef3f761-a29f-493f-8ace-68422774cc27" (UID: "6ef3f761-a29f-493f-8ace-68422774cc27"). InnerVolumeSpecName "kube-api-access-68v98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.571863 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68v98\" (UniqueName: \"kubernetes.io/projected/6ef3f761-a29f-493f-8ace-68422774cc27-kube-api-access-68v98\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.571892 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl8zd\" (UniqueName: \"kubernetes.io/projected/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-kube-api-access-dl8zd\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.571902 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc8d0192-f2bf-4dfb-b7b7-fba83523017e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.744645 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m2rxm" event={"ID":"b221a534-cb94-42b4-af86-da3c66d827d5","Type":"ContainerStarted","Data":"d13a60015043ad5554df404d33189aa6990d47b71ad546a479a1c8654d6a03d8"} Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.746683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-485b-account-create-update-6l8rt" event={"ID":"dc8d0192-f2bf-4dfb-b7b7-fba83523017e","Type":"ContainerDied","Data":"6ec98b2c19661f5ec505a384a051e3003a0dc0b0cddefdb028e45477d6eeb9bd"} Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.746729 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ec98b2c19661f5ec505a384a051e3003a0dc0b0cddefdb028e45477d6eeb9bd" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.746733 4763 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-485b-account-create-update-6l8rt" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.748011 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-drwnp" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.748081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-drwnp" event={"ID":"85027a6d-3549-4e56-80b6-b8133b118eae","Type":"ContainerDied","Data":"ab9cad0481027383fcd782e4a73a9fc6583117e49f98d6e2592dab5d66a67e1b"} Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.748126 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9cad0481027383fcd782e4a73a9fc6583117e49f98d6e2592dab5d66a67e1b" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.749558 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wk8xf" event={"ID":"b805f491-7378-4eae-8c43-af5315818571","Type":"ContainerDied","Data":"32937dd60a97e2633722c75ee71a0995d2cc0a3d9cc58ce97854ea055b2e8ee7"} Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.749688 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32937dd60a97e2633722c75ee71a0995d2cc0a3d9cc58ce97854ea055b2e8ee7" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.749634 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wk8xf" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.750736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-42dt5" event={"ID":"d0a0b7a2-5ddf-4b28-8d66-73c0b126b740","Type":"ContainerDied","Data":"233be920b507b2154a65da8767d4aa7dbfd14cacf1f53c70ebec6e5037ccc748"} Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.750768 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="233be920b507b2154a65da8767d4aa7dbfd14cacf1f53c70ebec6e5037ccc748" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.750745 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-42dt5" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.751943 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-abc0-account-create-update-74tjb" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.751969 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-abc0-account-create-update-74tjb" event={"ID":"6ef3f761-a29f-493f-8ace-68422774cc27","Type":"ContainerDied","Data":"de7813519dd1700b13269024b99e303420b54ee19e8ff68328de102f5f052131"} Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.752122 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de7813519dd1700b13269024b99e303420b54ee19e8ff68328de102f5f052131" Dec 01 09:33:02 crc kubenswrapper[4763]: I1201 09:33:02.784234 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-m2rxm" podStartSLOduration=1.59981583 podStartE2EDuration="8.784216589s" podCreationTimestamp="2025-12-01 09:32:54 +0000 UTC" firstStartedPulling="2025-12-01 09:32:55.232743683 +0000 UTC m=+1092.501392451" lastFinishedPulling="2025-12-01 09:33:02.417144442 +0000 UTC m=+1099.685793210" observedRunningTime="2025-12-01 09:33:02.773096375 +0000 UTC m=+1100.041745143" watchObservedRunningTime="2025-12-01 09:33:02.784216589 +0000 UTC m=+1100.052865347" Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.039836 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.199297 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqpz8\" (UniqueName: \"kubernetes.io/projected/05241c5a-6373-4ec7-a9fc-737e27036bb0-kube-api-access-rqpz8\") pod \"05241c5a-6373-4ec7-a9fc-737e27036bb0\" (UID: \"05241c5a-6373-4ec7-a9fc-737e27036bb0\") " Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.199535 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05241c5a-6373-4ec7-a9fc-737e27036bb0-operator-scripts\") pod \"05241c5a-6373-4ec7-a9fc-737e27036bb0\" (UID: \"05241c5a-6373-4ec7-a9fc-737e27036bb0\") " Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.200264 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05241c5a-6373-4ec7-a9fc-737e27036bb0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05241c5a-6373-4ec7-a9fc-737e27036bb0" (UID: "05241c5a-6373-4ec7-a9fc-737e27036bb0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.204060 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05241c5a-6373-4ec7-a9fc-737e27036bb0-kube-api-access-rqpz8" (OuterVolumeSpecName: "kube-api-access-rqpz8") pod "05241c5a-6373-4ec7-a9fc-737e27036bb0" (UID: "05241c5a-6373-4ec7-a9fc-737e27036bb0"). InnerVolumeSpecName "kube-api-access-rqpz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.301841 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqpz8\" (UniqueName: \"kubernetes.io/projected/05241c5a-6373-4ec7-a9fc-737e27036bb0-kube-api-access-rqpz8\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.301878 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05241c5a-6373-4ec7-a9fc-737e27036bb0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.762157 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4e18-account-create-update-b6f8x" Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.762199 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4e18-account-create-update-b6f8x" event={"ID":"05241c5a-6373-4ec7-a9fc-737e27036bb0","Type":"ContainerDied","Data":"d5b86692bcda0a1645b131315986ffbd1e7a159e09482df09f960958ae6a2006"} Dec 01 09:33:03 crc kubenswrapper[4763]: I1201 09:33:03.762539 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5b86692bcda0a1645b131315986ffbd1e7a159e09482df09f960958ae6a2006" Dec 01 09:33:10 crc kubenswrapper[4763]: I1201 09:33:10.823764 4763 generic.go:334] "Generic (PLEG): container finished" podID="b221a534-cb94-42b4-af86-da3c66d827d5" containerID="d13a60015043ad5554df404d33189aa6990d47b71ad546a479a1c8654d6a03d8" exitCode=0 Dec 01 09:33:10 crc kubenswrapper[4763]: I1201 09:33:10.824326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m2rxm" event={"ID":"b221a534-cb94-42b4-af86-da3c66d827d5","Type":"ContainerDied","Data":"d13a60015043ad5554df404d33189aa6990d47b71ad546a479a1c8654d6a03d8"} Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.168358 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.267850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-config-data\") pod \"b221a534-cb94-42b4-af86-da3c66d827d5\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.268025 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-combined-ca-bundle\") pod \"b221a534-cb94-42b4-af86-da3c66d827d5\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.268086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfphz\" (UniqueName: \"kubernetes.io/projected/b221a534-cb94-42b4-af86-da3c66d827d5-kube-api-access-zfphz\") pod \"b221a534-cb94-42b4-af86-da3c66d827d5\" (UID: \"b221a534-cb94-42b4-af86-da3c66d827d5\") " Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.273620 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b221a534-cb94-42b4-af86-da3c66d827d5-kube-api-access-zfphz" (OuterVolumeSpecName: "kube-api-access-zfphz") pod "b221a534-cb94-42b4-af86-da3c66d827d5" (UID: "b221a534-cb94-42b4-af86-da3c66d827d5"). 
InnerVolumeSpecName "kube-api-access-zfphz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.291844 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b221a534-cb94-42b4-af86-da3c66d827d5" (UID: "b221a534-cb94-42b4-af86-da3c66d827d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.321757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-config-data" (OuterVolumeSpecName: "config-data") pod "b221a534-cb94-42b4-af86-da3c66d827d5" (UID: "b221a534-cb94-42b4-af86-da3c66d827d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.369360 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.369384 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfphz\" (UniqueName: \"kubernetes.io/projected/b221a534-cb94-42b4-af86-da3c66d827d5-kube-api-access-zfphz\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.369416 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221a534-cb94-42b4-af86-da3c66d827d5-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.838423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m2rxm" event={"ID":"b221a534-cb94-42b4-af86-da3c66d827d5","Type":"ContainerDied","Data":"0b0411130401f14c526124b5f9b42f929b38021c589d4bdcd191eb603a269c1a"} Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.838779 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0411130401f14c526124b5f9b42f929b38021c589d4bdcd191eb603a269c1a" Dec 01 09:33:12 crc kubenswrapper[4763]: I1201 09:33:12.838481 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-m2rxm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.178264 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-b25qm"] Dec 01 09:33:13 crc kubenswrapper[4763]: E1201 09:33:13.178680 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b805f491-7378-4eae-8c43-af5315818571" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.178697 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b805f491-7378-4eae-8c43-af5315818571" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: E1201 09:33:13.178708 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b221a534-cb94-42b4-af86-da3c66d827d5" containerName="keystone-db-sync" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.178715 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b221a534-cb94-42b4-af86-da3c66d827d5" containerName="keystone-db-sync" Dec 01 09:33:13 crc kubenswrapper[4763]: E1201 09:33:13.178730 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8d0192-f2bf-4dfb-b7b7-fba83523017e" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.178738 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8d0192-f2bf-4dfb-b7b7-fba83523017e" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: E1201 09:33:13.178747 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05241c5a-6373-4ec7-a9fc-737e27036bb0" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.178755 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="05241c5a-6373-4ec7-a9fc-737e27036bb0" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: E1201 09:33:13.178783 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a0b7a2-5ddf-4b28-8d66-73c0b126b740" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.178790 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a0b7a2-5ddf-4b28-8d66-73c0b126b740" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: E1201 09:33:13.178803 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef3f761-a29f-493f-8ace-68422774cc27" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.178811 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef3f761-a29f-493f-8ace-68422774cc27" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: E1201 09:33:13.178824 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85027a6d-3549-4e56-80b6-b8133b118eae" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.178831 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="85027a6d-3549-4e56-80b6-b8133b118eae" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.179035 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="05241c5a-6373-4ec7-a9fc-737e27036bb0" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.179057 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b221a534-cb94-42b4-af86-da3c66d827d5" 
containerName="keystone-db-sync" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.179073 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a0b7a2-5ddf-4b28-8d66-73c0b126b740" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.179086 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef3f761-a29f-493f-8ace-68422774cc27" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.179097 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8d0192-f2bf-4dfb-b7b7-fba83523017e" containerName="mariadb-account-create-update" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.179109 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="85027a6d-3549-4e56-80b6-b8133b118eae" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.179122 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b805f491-7378-4eae-8c43-af5315818571" containerName="mariadb-database-create" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.181110 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.192104 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl7qh\" (UniqueName: \"kubernetes.io/projected/4b2b04f8-dfab-4508-be76-658e400c4b7d-kube-api-access-nl7qh\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.192345 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-config\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.192393 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-dns-svc\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.192443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-nb\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.192489 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-sb\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.209083 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-b25qm"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.242872 4763 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qfths"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.246428 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.251039 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w8tqv" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.251352 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.269935 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.270171 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.270337 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.276002 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfths"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294602 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl7qh\" (UniqueName: \"kubernetes.io/projected/4b2b04f8-dfab-4508-be76-658e400c4b7d-kube-api-access-nl7qh\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294660 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-fernet-keys\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294705 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-config-data\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294722 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nbfc\" (UniqueName: \"kubernetes.io/projected/45bce80f-f262-4afe-ab9c-47a638e02256-kube-api-access-9nbfc\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-config\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294802 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-dns-svc\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " 
pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294842 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-combined-ca-bundle\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294901 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-nb\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294955 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-sb\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.294986 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-scripts\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.295013 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-credential-keys\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.295587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-config\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.295784 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-nb\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.300976 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-sb\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.302101 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-dns-svc\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 
09:33:13.357382 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl7qh\" (UniqueName: \"kubernetes.io/projected/4b2b04f8-dfab-4508-be76-658e400c4b7d-kube-api-access-nl7qh\") pod \"dnsmasq-dns-66fbd85b65-b25qm\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.396296 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-scripts\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.396352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-credential-keys\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.396443 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-fernet-keys\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.396518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-config-data\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.396543 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nbfc\" (UniqueName: \"kubernetes.io/projected/45bce80f-f262-4afe-ab9c-47a638e02256-kube-api-access-9nbfc\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.396575 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-combined-ca-bundle\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.412508 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-combined-ca-bundle\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.412583 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-scripts\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.412791 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-config-data\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.431283 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-credential-keys\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.459414 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-fernet-keys\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.480364 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nbfc\" (UniqueName: \"kubernetes.io/projected/45bce80f-f262-4afe-ab9c-47a638e02256-kube-api-access-9nbfc\") pod \"keystone-bootstrap-qfths\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.532247 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.574642 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vh7wz"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.576150 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.583329 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xnlhk" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.583499 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.583584 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.584648 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.606643 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vh7wz"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.620717 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.622671 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.636450 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.638579 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.662225 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.707012 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-db-sync-config-data\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.707078 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-combined-ca-bundle\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.707100 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-scripts\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.707147 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br8c4\" (UniqueName: \"kubernetes.io/projected/ce04c2cb-ee0d-4530-8007-a853f1d4e785-kube-api-access-br8c4\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.707186 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce04c2cb-ee0d-4530-8007-a853f1d4e785-etc-machine-id\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.707204 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-config-data\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.740029 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xhw7j"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.741299 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.744805 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.754799 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xhw7j"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.756283 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dfh4h" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.769784 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-v52zn"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.770920 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.776579 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.777010 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vj8x6" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.778315 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.797514 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v52zn"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810645 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-db-sync-config-data\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-run-httpd\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810743 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-config-data\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-combined-ca-bundle\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810808 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-scripts\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810862 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br8c4\" (UniqueName: \"kubernetes.io/projected/ce04c2cb-ee0d-4530-8007-a853f1d4e785-kube-api-access-br8c4\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810906 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vdq8\" (UniqueName: \"kubernetes.io/projected/c135d6bf-430b-4fac-9b05-470df1e82e01-kube-api-access-5vdq8\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810947 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-scripts\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810969 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.810992 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce04c2cb-ee0d-4530-8007-a853f1d4e785-etc-machine-id\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.811023 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-log-httpd\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.811041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-config-data\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.877062 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-combined-ca-bundle\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.877510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce04c2cb-ee0d-4530-8007-a853f1d4e785-etc-machine-id\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " 
pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.908287 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-scripts\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.908604 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-db-sync-config-data\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.912852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-combined-ca-bundle\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.912917 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-combined-ca-bundle\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.912969 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913019 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vdq8\" (UniqueName: \"kubernetes.io/projected/c135d6bf-430b-4fac-9b05-470df1e82e01-kube-api-access-5vdq8\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913045 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-scripts\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913074 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913109 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-log-httpd\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913181 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb95c\" (UniqueName: 
\"kubernetes.io/projected/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-kube-api-access-mb95c\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913209 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-run-httpd\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913259 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-db-sync-config-data\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913284 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-config-data\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913317 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2kkj\" (UniqueName: \"kubernetes.io/projected/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-kube-api-access-b2kkj\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-config\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913670 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-log-httpd\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.913829 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-run-httpd\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.914880 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-config-data\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.916874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.925368 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br8c4\" (UniqueName: \"kubernetes.io/projected/ce04c2cb-ee0d-4530-8007-a853f1d4e785-kube-api-access-br8c4\") pod \"cinder-db-sync-vh7wz\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.931062 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-scripts\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.933485 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-config-data\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.950878 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.958448 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.977492 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vdq8\" (UniqueName: \"kubernetes.io/projected/c135d6bf-430b-4fac-9b05-470df1e82e01-kube-api-access-5vdq8\") pod \"ceilometer-0\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " pod="openstack/ceilometer-0" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.989858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-w9ssn"] Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.990949 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.999283 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-27kg4" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.999629 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 09:33:13 crc kubenswrapper[4763]: I1201 09:33:13.999804 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017572 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2kkj\" (UniqueName: \"kubernetes.io/projected/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-kube-api-access-b2kkj\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017636 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-config\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017667 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-config-data\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017707 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-combined-ca-bundle\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017743 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-combined-ca-bundle\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017806 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-scripts\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017836 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzbg\" (UniqueName: \"kubernetes.io/projected/071c6297-c7b9-44fd-807a-43b881312f92-kube-api-access-hxzbg\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017864 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-combined-ca-bundle\") pod 
\"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.017988 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071c6297-c7b9-44fd-807a-43b881312f92-logs\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.018013 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb95c\" (UniqueName: \"kubernetes.io/projected/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-kube-api-access-mb95c\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.018079 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-db-sync-config-data\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.018669 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-b25qm"] Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.063842 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-w9ssn"] Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.075876 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb95c\" (UniqueName: \"kubernetes.io/projected/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-kube-api-access-mb95c\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.075883 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-db-sync-config-data\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.076868 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-combined-ca-bundle\") pod \"barbican-db-sync-xhw7j\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") " pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.077637 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-config\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.079354 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-combined-ca-bundle\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.079893 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b2kkj\" (UniqueName: \"kubernetes.io/projected/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-kube-api-access-b2kkj\") pod \"neutron-db-sync-v52zn\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.082715 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-phqt2"] Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.084561 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.098181 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.105401 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-phqt2"] Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124221 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9wdx\" (UniqueName: \"kubernetes.io/projected/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-kube-api-access-z9wdx\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124273 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124301 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071c6297-c7b9-44fd-807a-43b881312f92-logs\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124466 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124576 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-config-data\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124666 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-config\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-scripts\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124776 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzbg\" (UniqueName: \"kubernetes.io/projected/071c6297-c7b9-44fd-807a-43b881312f92-kube-api-access-hxzbg\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-combined-ca-bundle\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.124860 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-dns-svc\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.125831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071c6297-c7b9-44fd-807a-43b881312f92-logs\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.131843 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-scripts\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.139057 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-config-data\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.143309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-combined-ca-bundle\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.158518 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzbg\" (UniqueName: \"kubernetes.io/projected/071c6297-c7b9-44fd-807a-43b881312f92-kube-api-access-hxzbg\") pod \"placement-db-sync-w9ssn\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.226129 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-config\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " 
pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.226211 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-dns-svc\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.226265 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9wdx\" (UniqueName: \"kubernetes.io/projected/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-kube-api-access-z9wdx\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.226292 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.226334 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.227231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.227787 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-config\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.228262 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-dns-svc\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.229060 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.262782 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wdx\" (UniqueName: \"kubernetes.io/projected/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-kube-api-access-z9wdx\") pod \"dnsmasq-dns-6bf59f66bf-phqt2\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.273583 
4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.341291 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.364751 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xhw7j" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.417880 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.578587 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qfths"] Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.623023 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-b25qm"] Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.737602 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vh7wz"] Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.880702 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v52zn"] Dec 01 09:33:14 crc kubenswrapper[4763]: I1201 09:33:14.965759 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vh7wz" event={"ID":"ce04c2cb-ee0d-4530-8007-a853f1d4e785","Type":"ContainerStarted","Data":"5e90ccd37fe7424482a9de8a8ce346a967ff16065f056d3247230238515509f0"} Dec 01 09:33:15 crc kubenswrapper[4763]: I1201 09:33:15.023131 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfths" event={"ID":"45bce80f-f262-4afe-ab9c-47a638e02256","Type":"ContainerStarted","Data":"823f047047310044d2ce93b1d047f2bdd259301af1ba0c3ec59edaf43662811d"} Dec 01 09:33:15 crc kubenswrapper[4763]: I1201 09:33:15.030411 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" event={"ID":"4b2b04f8-dfab-4508-be76-658e400c4b7d","Type":"ContainerStarted","Data":"7cdb5c358b09d41b919ac4856108b6aa46b018c4d527c0f6b9b243edadafc21a"} Dec 01 09:33:15 crc kubenswrapper[4763]: I1201 09:33:15.179308 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:33:15 crc kubenswrapper[4763]: I1201 09:33:15.291479 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xhw7j"] Dec 01 09:33:15 crc kubenswrapper[4763]: W1201 09:33:15.313538 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652820eb_87dd_4c77_bef1_1bcd7e68fdf5.slice/crio-85f1c24c079619380fb74ae77606a1a46cc687a0e65785c062a0f2fe80868a69 WatchSource:0}: Error finding container 85f1c24c079619380fb74ae77606a1a46cc687a0e65785c062a0f2fe80868a69: Status 404 returned error can't find the container with id 85f1c24c079619380fb74ae77606a1a46cc687a0e65785c062a0f2fe80868a69 Dec 01 09:33:15 crc kubenswrapper[4763]: I1201 09:33:15.338999 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-w9ssn"] Dec 01 09:33:15 crc kubenswrapper[4763]: I1201 09:33:15.420834 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-phqt2"] Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.042588 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v52zn" 
event={"ID":"d785a5a5-a9b9-45f0-b36a-a66fd298bfac","Type":"ContainerStarted","Data":"6d66fd4d61f3b65a735486be113b3b53f62d8cf9bef78d43e65c60c726045a6f"} Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.044987 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w9ssn" event={"ID":"071c6297-c7b9-44fd-807a-43b881312f92","Type":"ContainerStarted","Data":"4c789fa977d09784d720eef7fb8c9667c8434aa02f9f2c57e314e70be33370d3"} Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.048961 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xhw7j" event={"ID":"652820eb-87dd-4c77-bef1-1bcd7e68fdf5","Type":"ContainerStarted","Data":"85f1c24c079619380fb74ae77606a1a46cc687a0e65785c062a0f2fe80868a69"} Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.051748 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" event={"ID":"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57","Type":"ContainerStarted","Data":"43bd522ff30657c70e6a981902a80940c993f65f426bded203f9c90ec5b6de03"} Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.060270 4763 generic.go:334] "Generic (PLEG): container finished" podID="4b2b04f8-dfab-4508-be76-658e400c4b7d" containerID="50e62302594665537a78b3fa69d9098c45bb7088c0604078234ac086db0bd1e1" exitCode=0 Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.060355 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" event={"ID":"4b2b04f8-dfab-4508-be76-658e400c4b7d","Type":"ContainerDied","Data":"50e62302594665537a78b3fa69d9098c45bb7088c0604078234ac086db0bd1e1"} Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.067100 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerStarted","Data":"6367e1a058667bc4ae1a865750a90585d9002adb4d27142a8bac502a46b5a485"} Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.072988 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfths" event={"ID":"45bce80f-f262-4afe-ab9c-47a638e02256","Type":"ContainerStarted","Data":"58f996d498c3eb25ac59916ca1a98387aa1f9cd1a7b24403dd0736eaffd3aa37"} Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.111023 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qfths" podStartSLOduration=3.111005519 podStartE2EDuration="3.111005519s" podCreationTimestamp="2025-12-01 09:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:16.101950234 +0000 UTC m=+1113.370599002" watchObservedRunningTime="2025-12-01 09:33:16.111005519 +0000 UTC m=+1113.379654287" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.497138 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.527340 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-nb\") pod \"4b2b04f8-dfab-4508-be76-658e400c4b7d\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.527464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-config\") pod \"4b2b04f8-dfab-4508-be76-658e400c4b7d\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.527517 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl7qh\" (UniqueName: \"kubernetes.io/projected/4b2b04f8-dfab-4508-be76-658e400c4b7d-kube-api-access-nl7qh\") pod \"4b2b04f8-dfab-4508-be76-658e400c4b7d\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.527615 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-dns-svc\") pod \"4b2b04f8-dfab-4508-be76-658e400c4b7d\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.527663 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-sb\") pod \"4b2b04f8-dfab-4508-be76-658e400c4b7d\" (UID: \"4b2b04f8-dfab-4508-be76-658e400c4b7d\") " Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.558263 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b2b04f8-dfab-4508-be76-658e400c4b7d" (UID: "4b2b04f8-dfab-4508-be76-658e400c4b7d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.569977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b2b04f8-dfab-4508-be76-658e400c4b7d-kube-api-access-nl7qh" (OuterVolumeSpecName: "kube-api-access-nl7qh") pod "4b2b04f8-dfab-4508-be76-658e400c4b7d" (UID: "4b2b04f8-dfab-4508-be76-658e400c4b7d"). InnerVolumeSpecName "kube-api-access-nl7qh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.576070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-config" (OuterVolumeSpecName: "config") pod "4b2b04f8-dfab-4508-be76-658e400c4b7d" (UID: "4b2b04f8-dfab-4508-be76-658e400c4b7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.583643 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b2b04f8-dfab-4508-be76-658e400c4b7d" (UID: "4b2b04f8-dfab-4508-be76-658e400c4b7d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.617355 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b2b04f8-dfab-4508-be76-658e400c4b7d" (UID: "4b2b04f8-dfab-4508-be76-658e400c4b7d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.630228 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.630260 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.630272 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl7qh\" (UniqueName: \"kubernetes.io/projected/4b2b04f8-dfab-4508-be76-658e400c4b7d-kube-api-access-nl7qh\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.630283 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.630293 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b2b04f8-dfab-4508-be76-658e400c4b7d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:16 crc kubenswrapper[4763]: I1201 09:33:16.696393 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.097571 4763 generic.go:334] "Generic (PLEG): container finished" podID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" containerID="7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24" exitCode=0 Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.097915 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" event={"ID":"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57","Type":"ContainerDied","Data":"7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24"} Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.107043 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" event={"ID":"4b2b04f8-dfab-4508-be76-658e400c4b7d","Type":"ContainerDied","Data":"7cdb5c358b09d41b919ac4856108b6aa46b018c4d527c0f6b9b243edadafc21a"} Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.107094 4763 scope.go:117] "RemoveContainer" containerID="50e62302594665537a78b3fa69d9098c45bb7088c0604078234ac086db0bd1e1" Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.107180 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fbd85b65-b25qm" Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.125096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v52zn" event={"ID":"d785a5a5-a9b9-45f0-b36a-a66fd298bfac","Type":"ContainerStarted","Data":"f1d6a1b7ba846a48f945ae146dddbcadfba7d1fcd6f8ecb750d417e7408b7cb2"} Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.284549 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-b25qm"] Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.296768 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-b25qm"] Dec 01 09:33:17 crc kubenswrapper[4763]: I1201 09:33:17.300279 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-v52zn" podStartSLOduration=4.300259911 podStartE2EDuration="4.300259911s" podCreationTimestamp="2025-12-01 09:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:17.216380046 +0000 UTC m=+1114.485028814" watchObservedRunningTime="2025-12-01 09:33:17.300259911 +0000 UTC m=+1114.568908679" Dec 01 09:33:18 crc kubenswrapper[4763]: I1201 09:33:18.143095 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" event={"ID":"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57","Type":"ContainerStarted","Data":"b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b"} Dec 01 09:33:18 crc kubenswrapper[4763]: I1201 09:33:18.143412 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:18 crc kubenswrapper[4763]: I1201 09:33:18.180846 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" podStartSLOduration=5.180830171 podStartE2EDuration="5.180830171s" podCreationTimestamp="2025-12-01 09:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:18.173934436 +0000 UTC m=+1115.442583214" watchObservedRunningTime="2025-12-01 09:33:18.180830171 +0000 UTC m=+1115.449478939" Dec 01 09:33:19 crc kubenswrapper[4763]: I1201 09:33:19.014729 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b2b04f8-dfab-4508-be76-658e400c4b7d" path="/var/lib/kubelet/pods/4b2b04f8-dfab-4508-be76-658e400c4b7d/volumes" Dec 01 09:33:21 crc kubenswrapper[4763]: I1201 09:33:21.196727 4763 generic.go:334] "Generic (PLEG): container finished" podID="5ffbeb86-fa1d-4382-b9eb-a51bea87c540" containerID="23c13ea7c64fcd259467f9887f5be246b537ba8010d5e7e0911f6f48251109d7" exitCode=0 Dec 01 09:33:21 crc kubenswrapper[4763]: I1201 09:33:21.196832 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-glb7w" event={"ID":"5ffbeb86-fa1d-4382-b9eb-a51bea87c540","Type":"ContainerDied","Data":"23c13ea7c64fcd259467f9887f5be246b537ba8010d5e7e0911f6f48251109d7"} Dec 01 09:33:21 crc kubenswrapper[4763]: I1201 09:33:21.202316 4763 generic.go:334] "Generic (PLEG): container finished" podID="45bce80f-f262-4afe-ab9c-47a638e02256" containerID="58f996d498c3eb25ac59916ca1a98387aa1f9cd1a7b24403dd0736eaffd3aa37" exitCode=0 Dec 01 09:33:21 crc kubenswrapper[4763]: I1201 09:33:21.202362 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-qfths" event={"ID":"45bce80f-f262-4afe-ab9c-47a638e02256","Type":"ContainerDied","Data":"58f996d498c3eb25ac59916ca1a98387aa1f9cd1a7b24403dd0736eaffd3aa37"} Dec 01 09:33:24 crc kubenswrapper[4763]: I1201 09:33:24.419533 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:24 crc kubenswrapper[4763]: I1201 09:33:24.486012 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-9zjzk"] Dec 01 09:33:24 crc kubenswrapper[4763]: I1201 09:33:24.487862 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-9zjzk" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="dnsmasq-dns" containerID="cri-o://9482be36431dba72697ac9e0096d950a6f488ccc4bd68cff51d62414922ca39e" gracePeriod=10 Dec 01 09:33:25 crc kubenswrapper[4763]: I1201 09:33:25.238269 4763 generic.go:334] "Generic (PLEG): container finished" podID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerID="9482be36431dba72697ac9e0096d950a6f488ccc4bd68cff51d62414922ca39e" exitCode=0 Dec 01 09:33:25 crc kubenswrapper[4763]: I1201 09:33:25.238338 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-9zjzk" event={"ID":"cf4df61c-41b4-42c6-bdc7-dca59452a919","Type":"ContainerDied","Data":"9482be36431dba72697ac9e0096d950a6f488ccc4bd68cff51d62414922ca39e"} Dec 01 09:33:28 crc kubenswrapper[4763]: E1201 09:33:28.535495 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 01 09:33:28 crc kubenswrapper[4763]: E1201 09:33:28.535992 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxzbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-w9ssn_openstack(071c6297-c7b9-44fd-807a-43b881312f92): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:33:28 crc kubenswrapper[4763]: E1201 09:33:28.538778 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-w9ssn" podUID="071c6297-c7b9-44fd-807a-43b881312f92" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.612856 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.764670 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-fernet-keys\") pod \"45bce80f-f262-4afe-ab9c-47a638e02256\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.764719 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-scripts\") pod \"45bce80f-f262-4afe-ab9c-47a638e02256\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.764771 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nbfc\" (UniqueName: \"kubernetes.io/projected/45bce80f-f262-4afe-ab9c-47a638e02256-kube-api-access-9nbfc\") pod \"45bce80f-f262-4afe-ab9c-47a638e02256\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.764799 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-config-data\") pod \"45bce80f-f262-4afe-ab9c-47a638e02256\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.764905 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-credential-keys\") pod \"45bce80f-f262-4afe-ab9c-47a638e02256\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.764952 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-combined-ca-bundle\") pod \"45bce80f-f262-4afe-ab9c-47a638e02256\" (UID: \"45bce80f-f262-4afe-ab9c-47a638e02256\") " Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.770205 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "45bce80f-f262-4afe-ab9c-47a638e02256" (UID: "45bce80f-f262-4afe-ab9c-47a638e02256"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.770420 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45bce80f-f262-4afe-ab9c-47a638e02256-kube-api-access-9nbfc" (OuterVolumeSpecName: "kube-api-access-9nbfc") pod "45bce80f-f262-4afe-ab9c-47a638e02256" (UID: "45bce80f-f262-4afe-ab9c-47a638e02256"). InnerVolumeSpecName "kube-api-access-9nbfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.788966 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-scripts" (OuterVolumeSpecName: "scripts") pod "45bce80f-f262-4afe-ab9c-47a638e02256" (UID: "45bce80f-f262-4afe-ab9c-47a638e02256"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.790386 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "45bce80f-f262-4afe-ab9c-47a638e02256" (UID: "45bce80f-f262-4afe-ab9c-47a638e02256"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.793597 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45bce80f-f262-4afe-ab9c-47a638e02256" (UID: "45bce80f-f262-4afe-ab9c-47a638e02256"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.799022 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-config-data" (OuterVolumeSpecName: "config-data") pod "45bce80f-f262-4afe-ab9c-47a638e02256" (UID: "45bce80f-f262-4afe-ab9c-47a638e02256"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.866408 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.866443 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.866466 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nbfc\" (UniqueName: \"kubernetes.io/projected/45bce80f-f262-4afe-ab9c-47a638e02256-kube-api-access-9nbfc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.866505 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.866518 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:28 crc kubenswrapper[4763]: I1201 09:33:28.866526 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bce80f-f262-4afe-ab9c-47a638e02256-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.279515 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qfths" event={"ID":"45bce80f-f262-4afe-ab9c-47a638e02256","Type":"ContainerDied","Data":"823f047047310044d2ce93b1d047f2bdd259301af1ba0c3ec59edaf43662811d"} Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.279818 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823f047047310044d2ce93b1d047f2bdd259301af1ba0c3ec59edaf43662811d" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.279560 4763 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qfths" Dec 01 09:33:29 crc kubenswrapper[4763]: E1201 09:33:29.280672 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-w9ssn" podUID="071c6297-c7b9-44fd-807a-43b881312f92" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.736181 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qfths"] Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.744528 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qfths"] Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.855575 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j4mtm"] Dec 01 09:33:29 crc kubenswrapper[4763]: E1201 09:33:29.856350 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45bce80f-f262-4afe-ab9c-47a638e02256" containerName="keystone-bootstrap" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.856547 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="45bce80f-f262-4afe-ab9c-47a638e02256" containerName="keystone-bootstrap" Dec 01 09:33:29 crc kubenswrapper[4763]: E1201 09:33:29.856693 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b2b04f8-dfab-4508-be76-658e400c4b7d" containerName="init" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.856797 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b2b04f8-dfab-4508-be76-658e400c4b7d" containerName="init" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.857131 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b2b04f8-dfab-4508-be76-658e400c4b7d" containerName="init" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.857254 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="45bce80f-f262-4afe-ab9c-47a638e02256" containerName="keystone-bootstrap" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.858043 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.864548 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.865604 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.865992 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-w8tqv" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.866585 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.868582 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.884470 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j4mtm"] Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.986313 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdp24\" (UniqueName: \"kubernetes.io/projected/a9ba86b1-459e-422a-859e-95a9901d8d93-kube-api-access-zdp24\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.986661 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-credential-keys\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.986757 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-scripts\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.986817 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-combined-ca-bundle\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.986912 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-fernet-keys\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:29 crc kubenswrapper[4763]: I1201 09:33:29.986961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-config-data\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.088262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-credential-keys\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.088319 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-scripts\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.088339 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-combined-ca-bundle\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.088366 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-fernet-keys\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.088381 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-config-data\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.088433 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdp24\" (UniqueName: \"kubernetes.io/projected/a9ba86b1-459e-422a-859e-95a9901d8d93-kube-api-access-zdp24\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.094406 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-combined-ca-bundle\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.101872 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-credential-keys\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.102386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-fernet-keys\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.102931 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-config-data\") pod \"keystone-bootstrap-j4mtm\" (UID: 
\"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.103137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-scripts\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.105304 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdp24\" (UniqueName: \"kubernetes.io/projected/a9ba86b1-459e-422a-859e-95a9901d8d93-kube-api-access-zdp24\") pod \"keystone-bootstrap-j4mtm\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:30 crc kubenswrapper[4763]: I1201 09:33:30.185231 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:31 crc kubenswrapper[4763]: I1201 09:33:31.005347 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45bce80f-f262-4afe-ab9c-47a638e02256" path="/var/lib/kubelet/pods/45bce80f-f262-4afe-ab9c-47a638e02256/volumes" Dec 01 09:33:31 crc kubenswrapper[4763]: I1201 09:33:31.440387 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-9zjzk" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 01 09:33:36 crc kubenswrapper[4763]: I1201 09:33:36.441608 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-9zjzk" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.645351 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.654200 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-glb7w" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729426 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-config\") pod \"cf4df61c-41b4-42c6-bdc7-dca59452a919\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729570 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-combined-ca-bundle\") pod \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729715 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-dns-svc\") pod \"cf4df61c-41b4-42c6-bdc7-dca59452a919\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729792 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-sb\") pod \"cf4df61c-41b4-42c6-bdc7-dca59452a919\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-db-sync-config-data\") pod \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729886 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-nb\") pod \"cf4df61c-41b4-42c6-bdc7-dca59452a919\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729906 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcjzx\" (UniqueName: \"kubernetes.io/projected/cf4df61c-41b4-42c6-bdc7-dca59452a919-kube-api-access-gcjzx\") pod \"cf4df61c-41b4-42c6-bdc7-dca59452a919\" (UID: \"cf4df61c-41b4-42c6-bdc7-dca59452a919\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd2dz\" (UniqueName: \"kubernetes.io/projected/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-kube-api-access-pd2dz\") pod \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.729971 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-config-data\") pod \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\" (UID: \"5ffbeb86-fa1d-4382-b9eb-a51bea87c540\") " Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.758347 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod 
"5ffbeb86-fa1d-4382-b9eb-a51bea87c540" (UID: "5ffbeb86-fa1d-4382-b9eb-a51bea87c540"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.821899 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4df61c-41b4-42c6-bdc7-dca59452a919-kube-api-access-gcjzx" (OuterVolumeSpecName: "kube-api-access-gcjzx") pod "cf4df61c-41b4-42c6-bdc7-dca59452a919" (UID: "cf4df61c-41b4-42c6-bdc7-dca59452a919"). InnerVolumeSpecName "kube-api-access-gcjzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.822490 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-kube-api-access-pd2dz" (OuterVolumeSpecName: "kube-api-access-pd2dz") pod "5ffbeb86-fa1d-4382-b9eb-a51bea87c540" (UID: "5ffbeb86-fa1d-4382-b9eb-a51bea87c540"). InnerVolumeSpecName "kube-api-access-pd2dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.831580 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.831618 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcjzx\" (UniqueName: \"kubernetes.io/projected/cf4df61c-41b4-42c6-bdc7-dca59452a919-kube-api-access-gcjzx\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.831630 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd2dz\" (UniqueName: \"kubernetes.io/projected/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-kube-api-access-pd2dz\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.833430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ffbeb86-fa1d-4382-b9eb-a51bea87c540" (UID: "5ffbeb86-fa1d-4382-b9eb-a51bea87c540"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.849984 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf4df61c-41b4-42c6-bdc7-dca59452a919" (UID: "cf4df61c-41b4-42c6-bdc7-dca59452a919"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.873812 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf4df61c-41b4-42c6-bdc7-dca59452a919" (UID: "cf4df61c-41b4-42c6-bdc7-dca59452a919"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.875647 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-config" (OuterVolumeSpecName: "config") pod "cf4df61c-41b4-42c6-bdc7-dca59452a919" (UID: "cf4df61c-41b4-42c6-bdc7-dca59452a919"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.877125 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf4df61c-41b4-42c6-bdc7-dca59452a919" (UID: "cf4df61c-41b4-42c6-bdc7-dca59452a919"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.887449 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-config-data" (OuterVolumeSpecName: "config-data") pod "5ffbeb86-fa1d-4382-b9eb-a51bea87c540" (UID: "5ffbeb86-fa1d-4382-b9eb-a51bea87c540"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.933250 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.933296 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.933306 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.933316 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.933326 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf4df61c-41b4-42c6-bdc7-dca59452a919-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:38 crc kubenswrapper[4763]: I1201 09:33:38.933337 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffbeb86-fa1d-4382-b9eb-a51bea87c540-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:39 crc kubenswrapper[4763]: E1201 09:33:39.224962 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ffbeb86_fa1d_4382_b9eb_a51bea87c540.slice/crio-43c759c12ca071bce95929427b8a5f6d9b5b53428915da9edacfedc2ea96237a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf4df61c_41b4_42c6_bdc7_dca59452a919.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf4df61c_41b4_42c6_bdc7_dca59452a919.slice/crio-bbc1402016a8f8c3d882b7dd595880044080c141dac87273be9eaa1e583e35a4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ffbeb86_fa1d_4382_b9eb_a51bea87c540.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:33:39 crc kubenswrapper[4763]: I1201 09:33:39.367684 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-glb7w" Dec 01 09:33:39 crc kubenswrapper[4763]: I1201 09:33:39.367673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-glb7w" event={"ID":"5ffbeb86-fa1d-4382-b9eb-a51bea87c540","Type":"ContainerDied","Data":"43c759c12ca071bce95929427b8a5f6d9b5b53428915da9edacfedc2ea96237a"} Dec 01 09:33:39 crc kubenswrapper[4763]: I1201 09:33:39.368199 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43c759c12ca071bce95929427b8a5f6d9b5b53428915da9edacfedc2ea96237a" Dec 01 09:33:39 crc kubenswrapper[4763]: I1201 09:33:39.370068 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-9zjzk" event={"ID":"cf4df61c-41b4-42c6-bdc7-dca59452a919","Type":"ContainerDied","Data":"bbc1402016a8f8c3d882b7dd595880044080c141dac87273be9eaa1e583e35a4"} Dec 01 09:33:39 crc kubenswrapper[4763]: I1201 09:33:39.370101 4763 scope.go:117] "RemoveContainer" containerID="9482be36431dba72697ac9e0096d950a6f488ccc4bd68cff51d62414922ca39e" Dec 01 09:33:39 crc kubenswrapper[4763]: I1201 09:33:39.370275 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-9zjzk" Dec 01 09:33:39 crc kubenswrapper[4763]: I1201 09:33:39.403662 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-9zjzk"] Dec 01 09:33:39 crc kubenswrapper[4763]: I1201 09:33:39.422675 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-9zjzk"] Dec 01 09:33:39 crc kubenswrapper[4763]: E1201 09:33:39.527870 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 01 09:33:39 crc kubenswrapper[4763]: E1201 09:33:39.528167 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mb95c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xhw7j_openstack(652820eb-87dd-4c77-bef1-1bcd7e68fdf5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:33:39 crc kubenswrapper[4763]: E1201 09:33:39.529518 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xhw7j" podUID="652820eb-87dd-4c77-bef1-1bcd7e68fdf5" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.184742 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-vbch4"] Dec 01 09:33:40 crc kubenswrapper[4763]: E1201 09:33:40.185233 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="init" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.185250 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="init" Dec 01 09:33:40 crc kubenswrapper[4763]: E1201 09:33:40.185267 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffbeb86-fa1d-4382-b9eb-a51bea87c540" containerName="glance-db-sync" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.185275 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffbeb86-fa1d-4382-b9eb-a51bea87c540" containerName="glance-db-sync" Dec 01 09:33:40 crc kubenswrapper[4763]: E1201 09:33:40.185295 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="dnsmasq-dns" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.185304 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="dnsmasq-dns" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.185523 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffbeb86-fa1d-4382-b9eb-a51bea87c540" containerName="glance-db-sync" Dec 01 
09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.185544 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="dnsmasq-dns" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.186590 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.202376 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-vbch4"] Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.263750 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.263818 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.263881 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.264046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z7h7\" (UniqueName: \"kubernetes.io/projected/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-kube-api-access-8z7h7\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.264077 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-config\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.365366 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.365417 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.365468 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.365549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z7h7\" (UniqueName: \"kubernetes.io/projected/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-kube-api-access-8z7h7\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.365589 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-config\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.366564 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-config\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.367099 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.368352 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.368802 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: E1201 09:33:40.387667 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xhw7j" podUID="652820eb-87dd-4c77-bef1-1bcd7e68fdf5" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.390016 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z7h7\" (UniqueName: \"kubernetes.io/projected/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-kube-api-access-8z7h7\") pod \"dnsmasq-dns-5b6dbdb6f5-vbch4\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:40 crc kubenswrapper[4763]: I1201 09:33:40.529516 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:41 crc kubenswrapper[4763]: I1201 09:33:41.010127 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" path="/var/lib/kubelet/pods/cf4df61c-41b4-42c6-bdc7-dca59452a919/volumes" Dec 01 09:33:41 crc kubenswrapper[4763]: E1201 09:33:41.366049 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 01 09:33:41 crc kubenswrapper[4763]: E1201 09:33:41.366199 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-br8c4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vh7wz_openstack(ce04c2cb-ee0d-4530-8007-a853f1d4e785): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 09:33:41 crc kubenswrapper[4763]: E1201 09:33:41.368092 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/cinder-db-sync-vh7wz" podUID="ce04c2cb-ee0d-4530-8007-a853f1d4e785" Dec 01 09:33:41 crc kubenswrapper[4763]: I1201 09:33:41.379730 4763 scope.go:117] "RemoveContainer" containerID="f2073c15df3f1e669da2f08060c43a3fe29bb05313c3ec3ab3b4d5e771986cf7" Dec 01 09:33:41 crc kubenswrapper[4763]: E1201 09:33:41.410934 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-vh7wz" podUID="ce04c2cb-ee0d-4530-8007-a853f1d4e785" Dec 01 09:33:41 crc kubenswrapper[4763]: I1201 09:33:41.448605 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-9zjzk" podUID="cf4df61c-41b4-42c6-bdc7-dca59452a919" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.022446 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j4mtm"] Dec 01 09:33:42 crc kubenswrapper[4763]: W1201 09:33:42.026017 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c6253b2_59d7_4c69_8cc1_fb72e34df7f8.slice/crio-3310e93991d10aa0b9ddeefe0d5c81b75e2c97993fd2966837f85dd091ab38a8 WatchSource:0}: Error finding container 3310e93991d10aa0b9ddeefe0d5c81b75e2c97993fd2966837f85dd091ab38a8: Status 404 returned error can't find the container with id 3310e93991d10aa0b9ddeefe0d5c81b75e2c97993fd2966837f85dd091ab38a8 Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.034911 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-vbch4"] Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.427241 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j4mtm" event={"ID":"a9ba86b1-459e-422a-859e-95a9901d8d93","Type":"ContainerStarted","Data":"8ca12f3f037c2e21aecaaa07ea094c46dfb42a47fc7e7a6985db3c5500df52cc"} Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.429855 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j4mtm" event={"ID":"a9ba86b1-459e-422a-859e-95a9901d8d93","Type":"ContainerStarted","Data":"d48a9c4fe0f2e7e78f92641e48e10941b631fbde8566195eaa33fbf9c542f8cc"} Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.447958 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerStarted","Data":"6cf3dbbf7a2911884595a44b39b0c998675e710d51136c4870051374ae74091b"} Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.457838 4763 generic.go:334] "Generic (PLEG): container finished" podID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" containerID="7f3e1743e75aee9ed4f3696fac52492f405e118b0dc40abb627a9f15ad92e0a4" exitCode=0 Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.457886 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" event={"ID":"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8","Type":"ContainerDied","Data":"7f3e1743e75aee9ed4f3696fac52492f405e118b0dc40abb627a9f15ad92e0a4"} Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.457935 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" 
event={"ID":"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8","Type":"ContainerStarted","Data":"3310e93991d10aa0b9ddeefe0d5c81b75e2c97993fd2966837f85dd091ab38a8"} Dec 01 09:33:42 crc kubenswrapper[4763]: I1201 09:33:42.458227 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j4mtm" podStartSLOduration=13.458215952 podStartE2EDuration="13.458215952s" podCreationTimestamp="2025-12-01 09:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:42.445220286 +0000 UTC m=+1139.713869054" watchObservedRunningTime="2025-12-01 09:33:42.458215952 +0000 UTC m=+1139.726864720" Dec 01 09:33:43 crc kubenswrapper[4763]: I1201 09:33:43.487934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" event={"ID":"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8","Type":"ContainerStarted","Data":"8b8b16a07e005baea974092f60cbde27cee57acbc96c22aaa1da0e2ff2acd1a3"} Dec 01 09:33:43 crc kubenswrapper[4763]: I1201 09:33:43.488751 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:43 crc kubenswrapper[4763]: I1201 09:33:43.519108 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" podStartSLOduration=3.519081985 podStartE2EDuration="3.519081985s" podCreationTimestamp="2025-12-01 09:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:43.510485313 +0000 UTC m=+1140.779134091" watchObservedRunningTime="2025-12-01 09:33:43.519081985 +0000 UTC m=+1140.787730753" Dec 01 09:33:44 crc kubenswrapper[4763]: I1201 09:33:44.499331 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerStarted","Data":"3ed39fb8f70ee1b9dc1423f185906c7d882af9de3f9c86e9bb5328fcaf35cb32"} Dec 01 09:33:48 crc kubenswrapper[4763]: I1201 09:33:48.570776 4763 generic.go:334] "Generic (PLEG): container finished" podID="a9ba86b1-459e-422a-859e-95a9901d8d93" containerID="8ca12f3f037c2e21aecaaa07ea094c46dfb42a47fc7e7a6985db3c5500df52cc" exitCode=0 Dec 01 09:33:48 crc kubenswrapper[4763]: I1201 09:33:48.570957 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j4mtm" event={"ID":"a9ba86b1-459e-422a-859e-95a9901d8d93","Type":"ContainerDied","Data":"8ca12f3f037c2e21aecaaa07ea094c46dfb42a47fc7e7a6985db3c5500df52cc"} Dec 01 09:33:48 crc kubenswrapper[4763]: I1201 09:33:48.575618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w9ssn" event={"ID":"071c6297-c7b9-44fd-807a-43b881312f92","Type":"ContainerStarted","Data":"e933b4f6e7d82e8340a2085be91679a971ad5780f8e2701a180ef7c495cf8b2a"} Dec 01 09:33:48 crc kubenswrapper[4763]: I1201 09:33:48.623095 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-w9ssn" podStartSLOduration=3.073330277 podStartE2EDuration="35.623078211s" podCreationTimestamp="2025-12-01 09:33:13 +0000 UTC" firstStartedPulling="2025-12-01 09:33:15.394565084 +0000 UTC m=+1112.663213852" lastFinishedPulling="2025-12-01 09:33:47.944313018 +0000 UTC m=+1145.212961786" observedRunningTime="2025-12-01 09:33:48.620703634 +0000 UTC m=+1145.889352412" watchObservedRunningTime="2025-12-01 09:33:48.623078211 +0000 
UTC m=+1145.891726979" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.484345 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.551589 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.588816 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-combined-ca-bundle\") pod \"a9ba86b1-459e-422a-859e-95a9901d8d93\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.588872 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-scripts\") pod \"a9ba86b1-459e-422a-859e-95a9901d8d93\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.588893 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-config-data\") pod \"a9ba86b1-459e-422a-859e-95a9901d8d93\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.588983 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdp24\" (UniqueName: \"kubernetes.io/projected/a9ba86b1-459e-422a-859e-95a9901d8d93-kube-api-access-zdp24\") pod \"a9ba86b1-459e-422a-859e-95a9901d8d93\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.588999 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-credential-keys\") pod \"a9ba86b1-459e-422a-859e-95a9901d8d93\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.589093 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-fernet-keys\") pod \"a9ba86b1-459e-422a-859e-95a9901d8d93\" (UID: \"a9ba86b1-459e-422a-859e-95a9901d8d93\") " Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.611703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a9ba86b1-459e-422a-859e-95a9901d8d93" (UID: "a9ba86b1-459e-422a-859e-95a9901d8d93"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.612556 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9ba86b1-459e-422a-859e-95a9901d8d93-kube-api-access-zdp24" (OuterVolumeSpecName: "kube-api-access-zdp24") pod "a9ba86b1-459e-422a-859e-95a9901d8d93" (UID: "a9ba86b1-459e-422a-859e-95a9901d8d93"). InnerVolumeSpecName "kube-api-access-zdp24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.620372 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-scripts" (OuterVolumeSpecName: "scripts") pod "a9ba86b1-459e-422a-859e-95a9901d8d93" (UID: "a9ba86b1-459e-422a-859e-95a9901d8d93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.621058 4763 generic.go:334] "Generic (PLEG): container finished" podID="071c6297-c7b9-44fd-807a-43b881312f92" containerID="e933b4f6e7d82e8340a2085be91679a971ad5780f8e2701a180ef7c495cf8b2a" exitCode=0 Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.621122 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w9ssn" event={"ID":"071c6297-c7b9-44fd-807a-43b881312f92","Type":"ContainerDied","Data":"e933b4f6e7d82e8340a2085be91679a971ad5780f8e2701a180ef7c495cf8b2a"} Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.621566 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a9ba86b1-459e-422a-859e-95a9901d8d93" (UID: "a9ba86b1-459e-422a-859e-95a9901d8d93"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.622692 4763 generic.go:334] "Generic (PLEG): container finished" podID="d785a5a5-a9b9-45f0-b36a-a66fd298bfac" containerID="f1d6a1b7ba846a48f945ae146dddbcadfba7d1fcd6f8ecb750d417e7408b7cb2" exitCode=0 Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.622733 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v52zn" event={"ID":"d785a5a5-a9b9-45f0-b36a-a66fd298bfac","Type":"ContainerDied","Data":"f1d6a1b7ba846a48f945ae146dddbcadfba7d1fcd6f8ecb750d417e7408b7cb2"} Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.637282 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j4mtm" event={"ID":"a9ba86b1-459e-422a-859e-95a9901d8d93","Type":"ContainerDied","Data":"d48a9c4fe0f2e7e78f92641e48e10941b631fbde8566195eaa33fbf9c542f8cc"} Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.637527 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d48a9c4fe0f2e7e78f92641e48e10941b631fbde8566195eaa33fbf9c542f8cc" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.637690 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j4mtm" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.675055 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-phqt2"] Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.675642 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" podUID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" containerName="dnsmasq-dns" containerID="cri-o://b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b" gracePeriod=10 Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.684070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-config-data" (OuterVolumeSpecName: "config-data") pod "a9ba86b1-459e-422a-859e-95a9901d8d93" (UID: "a9ba86b1-459e-422a-859e-95a9901d8d93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.691506 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdp24\" (UniqueName: \"kubernetes.io/projected/a9ba86b1-459e-422a-859e-95a9901d8d93-kube-api-access-zdp24\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.691539 4763 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.691548 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.691556 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.691564 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.697001 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9ba86b1-459e-422a-859e-95a9901d8d93" (UID: "a9ba86b1-459e-422a-859e-95a9901d8d93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.805906 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ba86b1-459e-422a-859e-95a9901d8d93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.839209 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7cddbbbc75-sb8z7"] Dec 01 09:33:50 crc kubenswrapper[4763]: E1201 09:33:50.839788 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9ba86b1-459e-422a-859e-95a9901d8d93" containerName="keystone-bootstrap" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.839854 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9ba86b1-459e-422a-859e-95a9901d8d93" containerName="keystone-bootstrap" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.840077 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9ba86b1-459e-422a-859e-95a9901d8d93" containerName="keystone-bootstrap" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.841968 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.849297 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.849315 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.853597 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cddbbbc75-sb8z7"] Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.907397 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-config-data\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.907491 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcslg\" (UniqueName: \"kubernetes.io/projected/f503ba10-c7e4-4615-9137-8138e0dfb3f9-kube-api-access-rcslg\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.907530 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-scripts\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.907575 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-combined-ca-bundle\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.907602 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-internal-tls-certs\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.907630 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-fernet-keys\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.907676 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-public-tls-certs\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:50 crc kubenswrapper[4763]: I1201 09:33:50.907707 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-credential-keys\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.010320 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-config-data\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.010416 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcslg\" (UniqueName: \"kubernetes.io/projected/f503ba10-c7e4-4615-9137-8138e0dfb3f9-kube-api-access-rcslg\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.010444 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-scripts\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.010509 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-combined-ca-bundle\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.010531 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-internal-tls-certs\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.010575 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-fernet-keys\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.010615 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-public-tls-certs\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.015205 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-credential-keys\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.019402 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-internal-tls-certs\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.019625 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-combined-ca-bundle\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.019996 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-config-data\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.020326 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-fernet-keys\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.021199 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-public-tls-certs\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.024696 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-credential-keys\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.034309 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f503ba10-c7e4-4615-9137-8138e0dfb3f9-scripts\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " 
pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.040766 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcslg\" (UniqueName: \"kubernetes.io/projected/f503ba10-c7e4-4615-9137-8138e0dfb3f9-kube-api-access-rcslg\") pod \"keystone-7cddbbbc75-sb8z7\" (UID: \"f503ba10-c7e4-4615-9137-8138e0dfb3f9\") " pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.173454 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.175012 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.319789 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-sb\") pod \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.320328 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-dns-svc\") pod \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.320430 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-nb\") pod \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.320602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9wdx\" (UniqueName: \"kubernetes.io/projected/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-kube-api-access-z9wdx\") pod \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.320701 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-config\") pod \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\" (UID: \"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57\") " Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.355854 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-kube-api-access-z9wdx" (OuterVolumeSpecName: "kube-api-access-z9wdx") pod "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" (UID: "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57"). InnerVolumeSpecName "kube-api-access-z9wdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.395958 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" (UID: "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.397887 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" (UID: "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.422389 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" (UID: "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.424624 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.424748 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.424814 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.424880 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9wdx\" (UniqueName: \"kubernetes.io/projected/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-kube-api-access-z9wdx\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.437655 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-config" (OuterVolumeSpecName: "config") pod "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" (UID: "d8f9f997-dfe4-452a-8c7e-e7521ca5ce57"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.526768 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.648594 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cddbbbc75-sb8z7"] Dec 01 09:33:51 crc kubenswrapper[4763]: W1201 09:33:51.651982 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf503ba10_c7e4_4615_9137_8138e0dfb3f9.slice/crio-367cf89b8d49e2fd5ceb410b700344955ad0cb537f676f68082a994b480c4711 WatchSource:0}: Error finding container 367cf89b8d49e2fd5ceb410b700344955ad0cb537f676f68082a994b480c4711: Status 404 returned error can't find the container with id 367cf89b8d49e2fd5ceb410b700344955ad0cb537f676f68082a994b480c4711 Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.652797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerStarted","Data":"9135b60634a809faa34954b213b43cd22625896aed679407b9f15baf45bd4da5"} Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.654850 4763 generic.go:334] "Generic (PLEG): container finished" podID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" containerID="b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b" exitCode=0 Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.655147 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.655654 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" event={"ID":"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57","Type":"ContainerDied","Data":"b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b"} Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.655770 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-phqt2" event={"ID":"d8f9f997-dfe4-452a-8c7e-e7521ca5ce57","Type":"ContainerDied","Data":"43bd522ff30657c70e6a981902a80940c993f65f426bded203f9c90ec5b6de03"} Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.655898 4763 scope.go:117] "RemoveContainer" containerID="b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.728610 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-phqt2"] Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.737519 4763 scope.go:117] "RemoveContainer" containerID="7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.744670 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-phqt2"] Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.776367 4763 scope.go:117] "RemoveContainer" containerID="b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b" Dec 01 09:33:51 crc kubenswrapper[4763]: E1201 09:33:51.778737 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b\": container with ID starting with 
b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b not found: ID does not exist" containerID="b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.778806 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b"} err="failed to get container status \"b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b\": rpc error: code = NotFound desc = could not find container \"b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b\": container with ID starting with b96c37a77a56ad404199c8a2962303c69b79435c846a17d0b35dc68be14c1c0b not found: ID does not exist" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.778843 4763 scope.go:117] "RemoveContainer" containerID="7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24" Dec 01 09:33:51 crc kubenswrapper[4763]: E1201 09:33:51.779527 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24\": container with ID starting with 7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24 not found: ID does not exist" containerID="7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24" Dec 01 09:33:51 crc kubenswrapper[4763]: I1201 09:33:51.779552 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24"} err="failed to get container status \"7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24\": rpc error: code = NotFound desc = could not find container \"7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24\": container with ID starting with 7555f5dfc56ef50a21e89a7cc5f0b1451f4e9f606c52733684d64d569b499e24 not found: ID does not exist" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.189034 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.298446 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.354676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-combined-ca-bundle\") pod \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.355325 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-config\") pod \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.355481 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2kkj\" (UniqueName: \"kubernetes.io/projected/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-kube-api-access-b2kkj\") pod \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\" (UID: \"d785a5a5-a9b9-45f0-b36a-a66fd298bfac\") " Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.372070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-kube-api-access-b2kkj" (OuterVolumeSpecName: "kube-api-access-b2kkj") pod "d785a5a5-a9b9-45f0-b36a-a66fd298bfac" (UID: "d785a5a5-a9b9-45f0-b36a-a66fd298bfac"). InnerVolumeSpecName "kube-api-access-b2kkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.394628 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d785a5a5-a9b9-45f0-b36a-a66fd298bfac" (UID: "d785a5a5-a9b9-45f0-b36a-a66fd298bfac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.394760 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-config" (OuterVolumeSpecName: "config") pod "d785a5a5-a9b9-45f0-b36a-a66fd298bfac" (UID: "d785a5a5-a9b9-45f0-b36a-a66fd298bfac"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.457019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-scripts\") pod \"071c6297-c7b9-44fd-807a-43b881312f92\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.457089 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-config-data\") pod \"071c6297-c7b9-44fd-807a-43b881312f92\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.457146 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071c6297-c7b9-44fd-807a-43b881312f92-logs\") pod \"071c6297-c7b9-44fd-807a-43b881312f92\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.457244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxzbg\" (UniqueName: \"kubernetes.io/projected/071c6297-c7b9-44fd-807a-43b881312f92-kube-api-access-hxzbg\") pod \"071c6297-c7b9-44fd-807a-43b881312f92\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.457291 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-combined-ca-bundle\") pod \"071c6297-c7b9-44fd-807a-43b881312f92\" (UID: \"071c6297-c7b9-44fd-807a-43b881312f92\") " Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.458103 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.458122 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2kkj\" (UniqueName: \"kubernetes.io/projected/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-kube-api-access-b2kkj\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.458132 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d785a5a5-a9b9-45f0-b36a-a66fd298bfac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.460436 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071c6297-c7b9-44fd-807a-43b881312f92-logs" (OuterVolumeSpecName: "logs") pod "071c6297-c7b9-44fd-807a-43b881312f92" (UID: "071c6297-c7b9-44fd-807a-43b881312f92"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.465676 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071c6297-c7b9-44fd-807a-43b881312f92-kube-api-access-hxzbg" (OuterVolumeSpecName: "kube-api-access-hxzbg") pod "071c6297-c7b9-44fd-807a-43b881312f92" (UID: "071c6297-c7b9-44fd-807a-43b881312f92"). InnerVolumeSpecName "kube-api-access-hxzbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.467988 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-scripts" (OuterVolumeSpecName: "scripts") pod "071c6297-c7b9-44fd-807a-43b881312f92" (UID: "071c6297-c7b9-44fd-807a-43b881312f92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.504645 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "071c6297-c7b9-44fd-807a-43b881312f92" (UID: "071c6297-c7b9-44fd-807a-43b881312f92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.540342 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-config-data" (OuterVolumeSpecName: "config-data") pod "071c6297-c7b9-44fd-807a-43b881312f92" (UID: "071c6297-c7b9-44fd-807a-43b881312f92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.559888 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.560096 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.560195 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071c6297-c7b9-44fd-807a-43b881312f92-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.560252 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxzbg\" (UniqueName: \"kubernetes.io/projected/071c6297-c7b9-44fd-807a-43b881312f92-kube-api-access-hxzbg\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.560304 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071c6297-c7b9-44fd-807a-43b881312f92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.666728 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w9ssn" event={"ID":"071c6297-c7b9-44fd-807a-43b881312f92","Type":"ContainerDied","Data":"4c789fa977d09784d720eef7fb8c9667c8434aa02f9f2c57e314e70be33370d3"} Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.666777 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c789fa977d09784d720eef7fb8c9667c8434aa02f9f2c57e314e70be33370d3" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.666845 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-w9ssn" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.687952 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v52zn" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.688128 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v52zn" event={"ID":"d785a5a5-a9b9-45f0-b36a-a66fd298bfac","Type":"ContainerDied","Data":"6d66fd4d61f3b65a735486be113b3b53f62d8cf9bef78d43e65c60c726045a6f"} Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.688181 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d66fd4d61f3b65a735486be113b3b53f62d8cf9bef78d43e65c60c726045a6f" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.691129 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cddbbbc75-sb8z7" event={"ID":"f503ba10-c7e4-4615-9137-8138e0dfb3f9","Type":"ContainerStarted","Data":"808021a23511f24d8adbad072c28efac27a431a12aa55ab321ae793f5e3ff4b5"} Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.691171 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cddbbbc75-sb8z7" event={"ID":"f503ba10-c7e4-4615-9137-8138e0dfb3f9","Type":"ContainerStarted","Data":"367cf89b8d49e2fd5ceb410b700344955ad0cb537f676f68082a994b480c4711"} Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.691330 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.716964 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7cddbbbc75-sb8z7" podStartSLOduration=2.716943304 podStartE2EDuration="2.716943304s" podCreationTimestamp="2025-12-01 09:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:52.710276356 +0000 UTC m=+1149.978925124" watchObservedRunningTime="2025-12-01 09:33:52.716943304 +0000 UTC m=+1149.985592072" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.820549 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bbd79555b-kk8vr"] Dec 01 09:33:52 crc kubenswrapper[4763]: E1201 09:33:52.820896 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" containerName="init" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.820909 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" containerName="init" Dec 01 09:33:52 crc kubenswrapper[4763]: E1201 09:33:52.820923 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" containerName="dnsmasq-dns" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.820931 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" containerName="dnsmasq-dns" Dec 01 09:33:52 crc kubenswrapper[4763]: E1201 09:33:52.820954 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071c6297-c7b9-44fd-807a-43b881312f92" containerName="placement-db-sync" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.820960 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="071c6297-c7b9-44fd-807a-43b881312f92" containerName="placement-db-sync" Dec 01 09:33:52 crc kubenswrapper[4763]: E1201 09:33:52.820969 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d785a5a5-a9b9-45f0-b36a-a66fd298bfac" containerName="neutron-db-sync" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.820978 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d785a5a5-a9b9-45f0-b36a-a66fd298bfac" containerName="neutron-db-sync" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.821157 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d785a5a5-a9b9-45f0-b36a-a66fd298bfac" containerName="neutron-db-sync" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.821169 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="071c6297-c7b9-44fd-807a-43b881312f92" containerName="placement-db-sync" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.821176 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" containerName="dnsmasq-dns" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.824500 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.830679 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-27kg4" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.831527 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.832765 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.832785 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.832918 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.835625 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bbd79555b-kk8vr"] Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.968551 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-combined-ca-bundle\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.968708 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-scripts\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.968755 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-config-data\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.968849 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-public-tls-certs\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:52 crc 
kubenswrapper[4763]: I1201 09:33:52.968969 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpbfw\" (UniqueName: \"kubernetes.io/projected/cdaab12f-7433-420e-bdf0-99ef2e2f5707-kube-api-access-lpbfw\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.969021 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-internal-tls-certs\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:52 crc kubenswrapper[4763]: I1201 09:33:52.969125 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdaab12f-7433-420e-bdf0-99ef2e2f5707-logs\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.040412 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f9f997-dfe4-452a-8c7e-e7521ca5ce57" path="/var/lib/kubelet/pods/d8f9f997-dfe4-452a-8c7e-e7521ca5ce57/volumes" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.043097 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-cv6kb"] Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.057013 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.076437 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdaab12f-7433-420e-bdf0-99ef2e2f5707-logs\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.076625 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-combined-ca-bundle\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.076718 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-scripts\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.076772 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-config-data\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.076808 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-public-tls-certs\") pod 
\"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.076847 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpbfw\" (UniqueName: \"kubernetes.io/projected/cdaab12f-7433-420e-bdf0-99ef2e2f5707-kube-api-access-lpbfw\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.076884 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-internal-tls-certs\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.084271 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdaab12f-7433-420e-bdf0-99ef2e2f5707-logs\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.101052 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-combined-ca-bundle\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.105686 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-config-data\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.120046 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-scripts\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.121934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-public-tls-certs\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.122332 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdaab12f-7433-420e-bdf0-99ef2e2f5707-internal-tls-certs\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.136622 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpbfw\" (UniqueName: \"kubernetes.io/projected/cdaab12f-7433-420e-bdf0-99ef2e2f5707-kube-api-access-lpbfw\") pod \"placement-5bbd79555b-kk8vr\" (UID: \"cdaab12f-7433-420e-bdf0-99ef2e2f5707\") " pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc 
kubenswrapper[4763]: I1201 09:33:53.179272 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.179373 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-config\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.179400 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drkqn\" (UniqueName: \"kubernetes.io/projected/e54f7054-aade-43c0-86b7-5338d093463a-kube-api-access-drkqn\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.179513 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.179539 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.200049 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.211570 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-cv6kb"] Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.235994 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d79d596fb-f8qcf"] Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.241063 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.249209 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.249343 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vj8x6" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.249364 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.249448 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.257687 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d79d596fb-f8qcf"] Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.281531 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.281601 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.282438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.282785 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.283521 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.292760 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.293060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-config\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 
09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.293166 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drkqn\" (UniqueName: \"kubernetes.io/projected/e54f7054-aade-43c0-86b7-5338d093463a-kube-api-access-drkqn\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.294139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-config\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.317491 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drkqn\" (UniqueName: \"kubernetes.io/projected/e54f7054-aade-43c0-86b7-5338d093463a-kube-api-access-drkqn\") pod \"dnsmasq-dns-5f66db59b9-cv6kb\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.395349 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-combined-ca-bundle\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.395656 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-config\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.395717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvddj\" (UniqueName: \"kubernetes.io/projected/5b735fb9-9fbf-4d03-9b87-e6d57634813a-kube-api-access-qvddj\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.395759 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-ovndb-tls-certs\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.395821 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-httpd-config\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.449985 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.497304 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-httpd-config\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.497371 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-combined-ca-bundle\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.497410 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-config\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.497474 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvddj\" (UniqueName: \"kubernetes.io/projected/5b735fb9-9fbf-4d03-9b87-e6d57634813a-kube-api-access-qvddj\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.497505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-ovndb-tls-certs\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.509251 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-ovndb-tls-certs\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.511846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-httpd-config\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.518168 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-config\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.526754 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-combined-ca-bundle\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.530079 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qvddj\" (UniqueName: \"kubernetes.io/projected/5b735fb9-9fbf-4d03-9b87-e6d57634813a-kube-api-access-qvddj\") pod \"neutron-5d79d596fb-f8qcf\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.571769 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:33:53 crc kubenswrapper[4763]: I1201 09:33:53.826888 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bbd79555b-kk8vr"] Dec 01 09:33:53 crc kubenswrapper[4763]: W1201 09:33:53.840750 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdaab12f_7433_420e_bdf0_99ef2e2f5707.slice/crio-08af72e87e3f17714739197673b6978d86cc2d2021de863b591e6990bf15a547 WatchSource:0}: Error finding container 08af72e87e3f17714739197673b6978d86cc2d2021de863b591e6990bf15a547: Status 404 returned error can't find the container with id 08af72e87e3f17714739197673b6978d86cc2d2021de863b591e6990bf15a547 Dec 01 09:33:54 crc kubenswrapper[4763]: I1201 09:33:54.022932 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-cv6kb"] Dec 01 09:33:54 crc kubenswrapper[4763]: I1201 09:33:54.418311 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d79d596fb-f8qcf"] Dec 01 09:33:54 crc kubenswrapper[4763]: W1201 09:33:54.431921 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b735fb9_9fbf_4d03_9b87_e6d57634813a.slice/crio-c5333fbb295544ff584f064f8b4b80e7421cc48beed1a35467edd56b22f1e52b WatchSource:0}: Error finding container c5333fbb295544ff584f064f8b4b80e7421cc48beed1a35467edd56b22f1e52b: Status 404 returned error can't find the container with id c5333fbb295544ff584f064f8b4b80e7421cc48beed1a35467edd56b22f1e52b Dec 01 09:33:54 crc kubenswrapper[4763]: I1201 09:33:54.720477 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" event={"ID":"e54f7054-aade-43c0-86b7-5338d093463a","Type":"ContainerStarted","Data":"3f55e315acf1e84451c4d0f1640190f762ab978b61a218e5a96832930de5c769"} Dec 01 09:33:54 crc kubenswrapper[4763]: I1201 09:33:54.722252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bbd79555b-kk8vr" event={"ID":"cdaab12f-7433-420e-bdf0-99ef2e2f5707","Type":"ContainerStarted","Data":"08af72e87e3f17714739197673b6978d86cc2d2021de863b591e6990bf15a547"} Dec 01 09:33:54 crc kubenswrapper[4763]: I1201 09:33:54.723806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d79d596fb-f8qcf" event={"ID":"5b735fb9-9fbf-4d03-9b87-e6d57634813a","Type":"ContainerStarted","Data":"c5333fbb295544ff584f064f8b4b80e7421cc48beed1a35467edd56b22f1e52b"} Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.752201 4763 generic.go:334] "Generic (PLEG): container finished" podID="e54f7054-aade-43c0-86b7-5338d093463a" containerID="509253a83b272760ff83bef8e71b01ef7f4f66cc2a222ca986db3f484956e4aa" exitCode=0 Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.752822 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" event={"ID":"e54f7054-aade-43c0-86b7-5338d093463a","Type":"ContainerDied","Data":"509253a83b272760ff83bef8e71b01ef7f4f66cc2a222ca986db3f484956e4aa"} Dec 01 09:33:55 crc 
kubenswrapper[4763]: I1201 09:33:55.763625 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bbd79555b-kk8vr" event={"ID":"cdaab12f-7433-420e-bdf0-99ef2e2f5707","Type":"ContainerStarted","Data":"96233a0f6c0b4c489ffa50c336dc6e759864cc8e11b51b9fed36969887f7a07f"}
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.763679 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bbd79555b-kk8vr" event={"ID":"cdaab12f-7433-420e-bdf0-99ef2e2f5707","Type":"ContainerStarted","Data":"8603f9aaa38e37f7eea954e4ff6de4b5aeebcd1e8c18e93794a9298336eb5606"}
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.764376 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bbd79555b-kk8vr"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.764395 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bbd79555b-kk8vr"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.767713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d79d596fb-f8qcf" event={"ID":"5b735fb9-9fbf-4d03-9b87-e6d57634813a","Type":"ContainerStarted","Data":"835d0dbca7e4a633961e6837fbc3d659420f7e5fb545c50f64b71d9f99650e2b"}
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.767765 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d79d596fb-f8qcf" event={"ID":"5b735fb9-9fbf-4d03-9b87-e6d57634813a","Type":"ContainerStarted","Data":"1e992a86e6515341c22c2de89a92b32a7629cb525b534d1e4af94a02200cf768"}
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.767961 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d79d596fb-f8qcf"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.783168 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54896b4dfc-stxgl"]
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.791712 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.802313 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.802913 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.862296 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54896b4dfc-stxgl"]
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.876849 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d79d596fb-f8qcf" podStartSLOduration=2.876824521 podStartE2EDuration="2.876824521s" podCreationTimestamp="2025-12-01 09:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:55.841060443 +0000 UTC m=+1153.109709221" watchObservedRunningTime="2025-12-01 09:33:55.876824521 +0000 UTC m=+1153.145473289"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.918037 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5bbd79555b-kk8vr" podStartSLOduration=3.918004272 podStartE2EDuration="3.918004272s" podCreationTimestamp="2025-12-01 09:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:55.882909282 +0000 UTC m=+1153.151558050" watchObservedRunningTime="2025-12-01 09:33:55.918004272 +0000 UTC m=+1153.186653040"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.975740 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-combined-ca-bundle\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.975842 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-public-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.975888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-httpd-config\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.978444 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vh7\" (UniqueName: \"kubernetes.io/projected/e30619d4-84f7-4a40-aca1-b6885d608e03-kube-api-access-q2vh7\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.978529 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-config\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.978571 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-ovndb-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:55 crc kubenswrapper[4763]: I1201 09:33:55.978712 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-internal-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.083215 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vh7\" (UniqueName: \"kubernetes.io/projected/e30619d4-84f7-4a40-aca1-b6885d608e03-kube-api-access-q2vh7\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.083569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-config\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.083609 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-ovndb-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.083658 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-internal-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.083694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-combined-ca-bundle\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.083755 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-public-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.083808 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-httpd-config\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.089765 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-internal-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.091305 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-combined-ca-bundle\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.094039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-config\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.094068 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-ovndb-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.094887 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-httpd-config\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.098055 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e30619d4-84f7-4a40-aca1-b6885d608e03-public-tls-certs\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.111742 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vh7\" (UniqueName: \"kubernetes.io/projected/e30619d4-84f7-4a40-aca1-b6885d608e03-kube-api-access-q2vh7\") pod \"neutron-54896b4dfc-stxgl\" (UID: \"e30619d4-84f7-4a40-aca1-b6885d608e03\") " pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.147333 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.780128 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" event={"ID":"e54f7054-aade-43c0-86b7-5338d093463a","Type":"ContainerStarted","Data":"b75d0c3be6ced7a2f6c56e2417b325e3bf167200900857ebac092edf62ef58d8"}
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.780592 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.783122 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xhw7j" event={"ID":"652820eb-87dd-4c77-bef1-1bcd7e68fdf5","Type":"ContainerStarted","Data":"8d2459adeba909601ab768047c986e689580ce0271262b8f4a8796ed4dd3b86a"}
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.790968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vh7wz" event={"ID":"ce04c2cb-ee0d-4530-8007-a853f1d4e785","Type":"ContainerStarted","Data":"4c938982466fb132cad880ab6c290acd16f9ee48487cdeee008ca5f4196ac52a"}
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.805610 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" podStartSLOduration=4.8055883900000005 podStartE2EDuration="4.80558839s" podCreationTimestamp="2025-12-01 09:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:33:56.801739681 +0000 UTC m=+1154.070388449" watchObservedRunningTime="2025-12-01 09:33:56.80558839 +0000 UTC m=+1154.074237158"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.826913 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vh7wz" podStartSLOduration=3.30514694 podStartE2EDuration="43.8268958s" podCreationTimestamp="2025-12-01 09:33:13 +0000 UTC" firstStartedPulling="2025-12-01 09:33:14.761102939 +0000 UTC m=+1112.029751707" lastFinishedPulling="2025-12-01 09:33:55.282851799 +0000 UTC m=+1152.551500567" observedRunningTime="2025-12-01 09:33:56.821980511 +0000 UTC m=+1154.090629289" watchObservedRunningTime="2025-12-01 09:33:56.8268958 +0000 UTC m=+1154.095544568"
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.845938 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xhw7j" podStartSLOduration=3.6430179860000003 podStartE2EDuration="43.845918057s" podCreationTimestamp="2025-12-01 09:33:13 +0000 UTC" firstStartedPulling="2025-12-01 09:33:15.337078615 +0000 UTC m=+1112.605727383" lastFinishedPulling="2025-12-01 09:33:55.539978686 +0000 UTC m=+1152.808627454" observedRunningTime="2025-12-01 09:33:56.84178294 +0000 UTC m=+1154.110431708" watchObservedRunningTime="2025-12-01 09:33:56.845918057 +0000 UTC m=+1154.114566825"
Dec 01 09:33:56 crc kubenswrapper[4763]: W1201 09:33:56.853627 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode30619d4_84f7_4a40_aca1_b6885d608e03.slice/crio-d58f925fdedc9f624c565e1965b216a51eb9187304a536baf8a30e40e0eb724e WatchSource:0}: Error finding container d58f925fdedc9f624c565e1965b216a51eb9187304a536baf8a30e40e0eb724e: Status 404 returned error can't find the container with id d58f925fdedc9f624c565e1965b216a51eb9187304a536baf8a30e40e0eb724e
Dec 01 09:33:56 crc kubenswrapper[4763]: I1201 09:33:56.862539 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54896b4dfc-stxgl"]
Dec 01 09:33:57 crc kubenswrapper[4763]: I1201 09:33:57.799758 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54896b4dfc-stxgl" event={"ID":"e30619d4-84f7-4a40-aca1-b6885d608e03","Type":"ContainerStarted","Data":"d58f925fdedc9f624c565e1965b216a51eb9187304a536baf8a30e40e0eb724e"}
Dec 01 09:33:59 crc kubenswrapper[4763]: I1201 09:33:59.824179 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54896b4dfc-stxgl" event={"ID":"e30619d4-84f7-4a40-aca1-b6885d608e03","Type":"ContainerStarted","Data":"35d94b66b3254cbc9a91ecabbeb556e7699a3c01e98215118e6ef9ffbfaf24f5"}
Dec 01 09:34:03 crc kubenswrapper[4763]: I1201 09:34:03.452336 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb"
Dec 01 09:34:03 crc kubenswrapper[4763]: I1201 09:34:03.517133 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-vbch4"]
Dec 01 09:34:03 crc kubenswrapper[4763]: I1201 09:34:03.517350 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" podUID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" containerName="dnsmasq-dns" containerID="cri-o://8b8b16a07e005baea974092f60cbde27cee57acbc96c22aaa1da0e2ff2acd1a3" gracePeriod=10
Dec 01 09:34:03 crc kubenswrapper[4763]: I1201 09:34:03.866800 4763 generic.go:334] "Generic (PLEG): container finished" podID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" containerID="8b8b16a07e005baea974092f60cbde27cee57acbc96c22aaa1da0e2ff2acd1a3" exitCode=0
Dec 01 09:34:03 crc kubenswrapper[4763]: I1201 09:34:03.866845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" event={"ID":"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8","Type":"ContainerDied","Data":"8b8b16a07e005baea974092f60cbde27cee57acbc96c22aaa1da0e2ff2acd1a3"}
Dec 01 09:34:04 crc kubenswrapper[4763]: I1201 09:34:04.919037 4763 generic.go:334] "Generic (PLEG): container finished" podID="652820eb-87dd-4c77-bef1-1bcd7e68fdf5" containerID="8d2459adeba909601ab768047c986e689580ce0271262b8f4a8796ed4dd3b86a" exitCode=0
Dec 01 09:34:04 crc kubenswrapper[4763]: I1201 09:34:04.919618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xhw7j" event={"ID":"652820eb-87dd-4c77-bef1-1bcd7e68fdf5","Type":"ContainerDied","Data":"8d2459adeba909601ab768047c986e689580ce0271262b8f4a8796ed4dd3b86a"}
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.216726 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4"
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.302089 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z7h7\" (UniqueName: \"kubernetes.io/projected/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-kube-api-access-8z7h7\") pod \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") "
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.302350 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-dns-svc\") pod \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") "
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.302548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-nb\") pod \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") "
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.302639 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-sb\") pod \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") "
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.302756 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-config\") pod \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\" (UID: \"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8\") "
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.311422 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-kube-api-access-8z7h7" (OuterVolumeSpecName: "kube-api-access-8z7h7") pod "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" (UID: "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8"). InnerVolumeSpecName "kube-api-access-8z7h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.351607 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-config" (OuterVolumeSpecName: "config") pod "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" (UID: "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.361165 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" (UID: "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.368812 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" (UID: "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.388683 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" (UID: "2c6253b2-59d7-4c69-8cc1-fb72e34df7f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.404613 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-config\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.404643 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z7h7\" (UniqueName: \"kubernetes.io/projected/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-kube-api-access-8z7h7\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.404654 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.404666 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.404674 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.929768 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54896b4dfc-stxgl" event={"ID":"e30619d4-84f7-4a40-aca1-b6885d608e03","Type":"ContainerStarted","Data":"e6d7d120fc7856ce6ec168c2edf4aa4361e60ea7718c6bf22cac3765ea21d2be"}
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.930647 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54896b4dfc-stxgl"
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.940313 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4"
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.941055 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-vbch4" event={"ID":"2c6253b2-59d7-4c69-8cc1-fb72e34df7f8","Type":"ContainerDied","Data":"3310e93991d10aa0b9ddeefe0d5c81b75e2c97993fd2966837f85dd091ab38a8"}
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.941119 4763 scope.go:117] "RemoveContainer" containerID="8b8b16a07e005baea974092f60cbde27cee57acbc96c22aaa1da0e2ff2acd1a3"
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.983451 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54896b4dfc-stxgl" podStartSLOduration=10.983427614 podStartE2EDuration="10.983427614s" podCreationTimestamp="2025-12-01 09:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:05.95204341 +0000 UTC m=+1163.220692198" watchObservedRunningTime="2025-12-01 09:34:05.983427614 +0000 UTC m=+1163.252076382"
Dec 01 09:34:05 crc kubenswrapper[4763]: I1201 09:34:05.995214 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-vbch4"]
Dec 01 09:34:06 crc kubenswrapper[4763]: I1201 09:34:06.009032 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-vbch4"]
Dec 01 09:34:07 crc kubenswrapper[4763]: I1201 09:34:07.006794 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" path="/var/lib/kubelet/pods/2c6253b2-59d7-4c69-8cc1-fb72e34df7f8/volumes"
Dec 01 09:34:08 crc kubenswrapper[4763]: I1201 09:34:08.808175 4763 scope.go:117] "RemoveContainer" containerID="7f3e1743e75aee9ed4f3696fac52492f405e118b0dc40abb627a9f15ad92e0a4"
Dec 01 09:34:08 crc kubenswrapper[4763]: I1201 09:34:08.983717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xhw7j" event={"ID":"652820eb-87dd-4c77-bef1-1bcd7e68fdf5","Type":"ContainerDied","Data":"85f1c24c079619380fb74ae77606a1a46cc687a0e65785c062a0f2fe80868a69"}
Dec 01 09:34:08 crc kubenswrapper[4763]: I1201 09:34:08.983968 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85f1c24c079619380fb74ae77606a1a46cc687a0e65785c062a0f2fe80868a69"
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.009477 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xhw7j"
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.066238 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-db-sync-config-data\") pod \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") "
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.066595 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb95c\" (UniqueName: \"kubernetes.io/projected/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-kube-api-access-mb95c\") pod \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") "
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.066646 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-combined-ca-bundle\") pod \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\" (UID: \"652820eb-87dd-4c77-bef1-1bcd7e68fdf5\") "
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.073188 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-kube-api-access-mb95c" (OuterVolumeSpecName: "kube-api-access-mb95c") pod "652820eb-87dd-4c77-bef1-1bcd7e68fdf5" (UID: "652820eb-87dd-4c77-bef1-1bcd7e68fdf5"). InnerVolumeSpecName "kube-api-access-mb95c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.073499 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "652820eb-87dd-4c77-bef1-1bcd7e68fdf5" (UID: "652820eb-87dd-4c77-bef1-1bcd7e68fdf5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.095695 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "652820eb-87dd-4c77-bef1-1bcd7e68fdf5" (UID: "652820eb-87dd-4c77-bef1-1bcd7e68fdf5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.169097 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.169146 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb95c\" (UniqueName: \"kubernetes.io/projected/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-kube-api-access-mb95c\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.169162 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652820eb-87dd-4c77-bef1-1bcd7e68fdf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 09:34:09 crc kubenswrapper[4763]: I1201 09:34:09.993129 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xhw7j"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.002580 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="ceilometer-central-agent" containerID="cri-o://6cf3dbbf7a2911884595a44b39b0c998675e710d51136c4870051374ae74091b" gracePeriod=30
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.006773 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerStarted","Data":"0065ad64b1673ef3fda346b29896e6309c7c991dd8db8d311f18222ce2cda70b"}
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.006998 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.007279 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="proxy-httpd" containerID="cri-o://0065ad64b1673ef3fda346b29896e6309c7c991dd8db8d311f18222ce2cda70b" gracePeriod=30
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.007353 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="sg-core" containerID="cri-o://9135b60634a809faa34954b213b43cd22625896aed679407b9f15baf45bd4da5" gracePeriod=30
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.007387 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="ceilometer-notification-agent" containerID="cri-o://3ed39fb8f70ee1b9dc1423f185906c7d882af9de3f9c86e9bb5328fcaf35cb32" gracePeriod=30
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.304760 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.721005063 podStartE2EDuration="57.304734167s" podCreationTimestamp="2025-12-01 09:33:13 +0000 UTC" firstStartedPulling="2025-12-01 09:33:15.246716018 +0000 UTC m=+1112.515364786" lastFinishedPulling="2025-12-01 09:34:08.830445122 +0000 UTC m=+1166.099093890" observedRunningTime="2025-12-01 09:34:10.064819846 +0000 UTC m=+1167.333468614" watchObservedRunningTime="2025-12-01 09:34:10.304734167 +0000 UTC m=+1167.573382935"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.307101 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6948ddcbd7-94xjg"]
Dec 01 09:34:10 crc kubenswrapper[4763]: E1201 09:34:10.307566 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" containerName="init"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.307630 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" containerName="init"
Dec 01 09:34:10 crc kubenswrapper[4763]: E1201 09:34:10.307689 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652820eb-87dd-4c77-bef1-1bcd7e68fdf5" containerName="barbican-db-sync"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.307736 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="652820eb-87dd-4c77-bef1-1bcd7e68fdf5" containerName="barbican-db-sync"
Dec 01 09:34:10 crc kubenswrapper[4763]: E1201 09:34:10.307794 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" containerName="dnsmasq-dns"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.307853 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" containerName="dnsmasq-dns"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.308125 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6253b2-59d7-4c69-8cc1-fb72e34df7f8" containerName="dnsmasq-dns"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.308203 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="652820eb-87dd-4c77-bef1-1bcd7e68fdf5" containerName="barbican-db-sync"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.309181 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.314470 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.314659 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dfh4h"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.314488 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.330519 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7db8c7446b-kkdjv"]
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.332392 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.337353 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.348512 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6948ddcbd7-94xjg"]
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.371355 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7db8c7446b-kkdjv"]
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.400495 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-combined-ca-bundle\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.400542 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-config-data-custom\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.400664 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-logs\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.400768 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-config-data\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.400858 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz6rs\" (UniqueName: \"kubernetes.io/projected/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-kube-api-access-wz6rs\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: E1201 09:34:10.425505 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc135d6bf_430b_4fac_9b05_470df1e82e01.slice/crio-9135b60634a809faa34954b213b43cd22625896aed679407b9f15baf45bd4da5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc135d6bf_430b_4fac_9b05_470df1e82e01.slice/crio-conmon-9135b60634a809faa34954b213b43cd22625896aed679407b9f15baf45bd4da5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc135d6bf_430b_4fac_9b05_470df1e82e01.slice/crio-0065ad64b1673ef3fda346b29896e6309c7c991dd8db8d311f18222ce2cda70b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652820eb_87dd_4c77_bef1_1bcd7e68fdf5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod652820eb_87dd_4c77_bef1_1bcd7e68fdf5.slice/crio-85f1c24c079619380fb74ae77606a1a46cc687a0e65785c062a0f2fe80868a69\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc135d6bf_430b_4fac_9b05_470df1e82e01.slice/crio-conmon-0065ad64b1673ef3fda346b29896e6309c7c991dd8db8d311f18222ce2cda70b.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-config-data\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506652 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a3e2134-8fc4-4ec2-9970-92959ed1778e-logs\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz6rs\" (UniqueName: \"kubernetes.io/projected/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-kube-api-access-wz6rs\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506704 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-config-data\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-combined-ca-bundle\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506761 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-config-data-custom\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506787 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-combined-ca-bundle\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506809 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-config-data-custom\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506840 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lwqs\" (UniqueName: \"kubernetes.io/projected/2a3e2134-8fc4-4ec2-9970-92959ed1778e-kube-api-access-2lwqs\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.506889 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-logs\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.507355 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-logs\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.542771 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869f779d85-f82xm"]
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.544432 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.549991 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-config-data\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.558304 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-combined-ca-bundle\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.562385 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-f82xm"]
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.578132 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-config-data-custom\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.592625 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz6rs\" (UniqueName: \"kubernetes.io/projected/9dac2a9d-f2f8-4167-8f68-f01c9364a59f-kube-api-access-wz6rs\") pod \"barbican-worker-6948ddcbd7-94xjg\" (UID: \"9dac2a9d-f2f8-4167-8f68-f01c9364a59f\") " pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.621347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lwqs\" (UniqueName: \"kubernetes.io/projected/2a3e2134-8fc4-4ec2-9970-92959ed1778e-kube-api-access-2lwqs\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.621495 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a3e2134-8fc4-4ec2-9970-92959ed1778e-logs\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.621520 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-config-data\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.621548 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-combined-ca-bundle\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.621567 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-config-data-custom\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.622119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a3e2134-8fc4-4ec2-9970-92959ed1778e-logs\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.625144 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-combined-ca-bundle\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.630992 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-config-data-custom\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.635280 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a3e2134-8fc4-4ec2-9970-92959ed1778e-config-data\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.655663 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lwqs\" (UniqueName: \"kubernetes.io/projected/2a3e2134-8fc4-4ec2-9970-92959ed1778e-kube-api-access-2lwqs\") pod \"barbican-keystone-listener-7db8c7446b-kkdjv\" (UID: \"2a3e2134-8fc4-4ec2-9970-92959ed1778e\") " pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.656563 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-798f56bcd6-5xxds"]
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.660958 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.669718 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.671850 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6948ddcbd7-94xjg"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.699102 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.700144 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-798f56bcd6-5xxds"]
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.728301 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.728343 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.728390 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-dns-svc\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.728421 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-config\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.728485 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bndj\" (UniqueName: \"kubernetes.io/projected/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-kube-api-access-6bndj\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.829720 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lpmt\" (UniqueName: \"kubernetes.io/projected/b94d6350-7336-4142-8909-bb3c5e09412f-kube-api-access-9lpmt\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830104 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-dns-svc\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830137 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data-custom\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830192 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-config\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830229 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bndj\" (UniqueName: \"kubernetes.io/projected/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-kube-api-access-6bndj\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830366 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b94d6350-7336-4142-8909-bb3c5e09412f-logs\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.830599 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-combined-ca-bundle\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.834317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-dns-svc\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.835170 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.835894 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.836432 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-config\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.863220 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bndj\" (UniqueName: \"kubernetes.io/projected/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-kube-api-access-6bndj\") pod \"dnsmasq-dns-869f779d85-f82xm\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.933581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b94d6350-7336-4142-8909-bb3c5e09412f-logs\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.933776 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-combined-ca-bundle\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.933860 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lpmt\" (UniqueName: \"kubernetes.io/projected/b94d6350-7336-4142-8909-bb3c5e09412f-kube-api-access-9lpmt\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.933930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data-custom\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.934008 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.936625 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b94d6350-7336-4142-8909-bb3c5e09412f-logs\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.942421 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data-custom\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.946524 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-combined-ca-bundle\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.950884 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.968792 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:10 crc kubenswrapper[4763]: I1201 09:34:10.968904 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lpmt\" (UniqueName: \"kubernetes.io/projected/b94d6350-7336-4142-8909-bb3c5e09412f-kube-api-access-9lpmt\") pod \"barbican-api-798f56bcd6-5xxds\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.009327 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.021481 4763 generic.go:334] "Generic (PLEG): container finished" podID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerID="0065ad64b1673ef3fda346b29896e6309c7c991dd8db8d311f18222ce2cda70b" exitCode=0
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.021516 4763 generic.go:334] "Generic (PLEG): container finished" podID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerID="9135b60634a809faa34954b213b43cd22625896aed679407b9f15baf45bd4da5" exitCode=2
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.021536 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerDied","Data":"0065ad64b1673ef3fda346b29896e6309c7c991dd8db8d311f18222ce2cda70b"}
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.021561 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerDied","Data":"9135b60634a809faa34954b213b43cd22625896aed679407b9f15baf45bd4da5"}
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.339621 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7db8c7446b-kkdjv"]
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.483368 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6948ddcbd7-94xjg"]
Dec 01 09:34:11 crc kubenswrapper[4763]: W1201 09:34:11.491354 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dac2a9d_f2f8_4167_8f68_f01c9364a59f.slice/crio-1ee313b19210e489b24d87db1e30c48b44284ef92501c22f8dbcb38321c8f826 WatchSource:0}: Error finding container 1ee313b19210e489b24d87db1e30c48b44284ef92501c22f8dbcb38321c8f826: Status 404 returned error can't find the container with id 1ee313b19210e489b24d87db1e30c48b44284ef92501c22f8dbcb38321c8f826
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.646671 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-f82xm"]
Dec 01 09:34:11 crc kubenswrapper[4763]: I1201 09:34:11.801406 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-798f56bcd6-5xxds"]
Dec 01 09:34:12 crc kubenswrapper[4763]: I1201 09:34:12.035636 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6948ddcbd7-94xjg" event={"ID":"9dac2a9d-f2f8-4167-8f68-f01c9364a59f","Type":"ContainerStarted","Data":"1ee313b19210e489b24d87db1e30c48b44284ef92501c22f8dbcb38321c8f826"}
Dec 01 09:34:12 crc kubenswrapper[4763]: I1201 09:34:12.038092 4763 generic.go:334] "Generic (PLEG): container finished" podID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerID="6cf3dbbf7a2911884595a44b39b0c998675e710d51136c4870051374ae74091b" exitCode=0
Dec 01 09:34:12 crc kubenswrapper[4763]: I1201 09:34:12.038146 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerDied","Data":"6cf3dbbf7a2911884595a44b39b0c998675e710d51136c4870051374ae74091b"}
Dec 01 09:34:12 crc kubenswrapper[4763]: I1201 09:34:12.042504 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-798f56bcd6-5xxds" event={"ID":"b94d6350-7336-4142-8909-bb3c5e09412f","Type":"ContainerStarted","Data":"9bea4edd035f74cca4a29915c45f1bc4ba4ec12c293ba206750f768695194013"}
Dec 01 09:34:12 crc kubenswrapper[4763]: I1201 09:34:12.044302 4763 generic.go:334] "Generic (PLEG): container finished" podID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" containerID="45f1ba37454104587d60a43f4c850df340dad33075fa4904648d600ed2824050" exitCode=0
Dec 01 09:34:12 crc kubenswrapper[4763]: I1201 09:34:12.044490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-f82xm" event={"ID":"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f","Type":"ContainerDied","Data":"45f1ba37454104587d60a43f4c850df340dad33075fa4904648d600ed2824050"}
Dec 01 09:34:12 crc kubenswrapper[4763]: I1201 09:34:12.044597 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-f82xm" event={"ID":"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f","Type":"ContainerStarted","Data":"92e994da55c997e6f8baea1b0d0129e76a1613e0fe91e7fbbd18f7cb29eac389"}
Dec 01 09:34:12 crc kubenswrapper[4763]: I1201 09:34:12.048761 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv" event={"ID":"2a3e2134-8fc4-4ec2-9970-92959ed1778e","Type":"ContainerStarted","Data":"f0cc8a3ebfd3a760de71d09b03c621e8c4b3dfadabadaa6d8d747fc681bcef61"}
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.062653 4763 generic.go:334] "Generic (PLEG): container finished" podID="ce04c2cb-ee0d-4530-8007-a853f1d4e785" containerID="4c938982466fb132cad880ab6c290acd16f9ee48487cdeee008ca5f4196ac52a" exitCode=0
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.063145 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vh7wz" event={"ID":"ce04c2cb-ee0d-4530-8007-a853f1d4e785","Type":"ContainerDied","Data":"4c938982466fb132cad880ab6c290acd16f9ee48487cdeee008ca5f4196ac52a"}
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.065180 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-798f56bcd6-5xxds" event={"ID":"b94d6350-7336-4142-8909-bb3c5e09412f","Type":"ContainerStarted","Data":"eb45742bfd1f12811b5bffd38a92fc7bbd55ee195e72e69b8300303b10cdd6bb"}
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.065221 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-798f56bcd6-5xxds" event={"ID":"b94d6350-7336-4142-8909-bb3c5e09412f","Type":"ContainerStarted","Data":"8a7f3305afcda3d190dc2a73b96d745bd8c6ca19d45c076bfd46d6f14de55911"}
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.065286 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.065344 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-798f56bcd6-5xxds"
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.067658 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-f82xm" event={"ID":"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f","Type":"ContainerStarted","Data":"33214036a9d2350e99f7186c256dab419514b3c4aeaaa874d318e5c8651af031"}
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.067814 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869f779d85-f82xm"
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.121532 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869f779d85-f82xm" podStartSLOduration=3.121511094 podStartE2EDuration="3.121511094s" podCreationTimestamp="2025-12-01 09:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:13.103049573 +0000 UTC m=+1170.371698351" watchObservedRunningTime="2025-12-01 09:34:13.121511094 +0000 UTC m=+1170.390159862"
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.127097 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-798f56bcd6-5xxds" podStartSLOduration=3.127079931 podStartE2EDuration="3.127079931s" podCreationTimestamp="2025-12-01 09:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:13.118719975 +0000 UTC m=+1170.387368753" watchObservedRunningTime="2025-12-01 09:34:13.127079931 +0000 UTC m=+1170.395728699"
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.529359 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d56769fd6-btjcl"]
Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.530859 4763 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.532925 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.534700 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.545267 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d56769fd6-btjcl"] Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.697888 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-combined-ca-bundle\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.697952 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8g26\" (UniqueName: \"kubernetes.io/projected/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-kube-api-access-n8g26\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.697986 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-internal-tls-certs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.698083 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-config-data\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.698218 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-logs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.698348 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-public-tls-certs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.698965 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-config-data-custom\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.800412 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-internal-tls-certs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.800520 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-config-data\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.800549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-logs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.800590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-public-tls-certs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.800619 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-config-data-custom\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.800669 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-combined-ca-bundle\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.800692 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8g26\" (UniqueName: \"kubernetes.io/projected/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-kube-api-access-n8g26\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.802182 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-logs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.809665 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-combined-ca-bundle\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.809832 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-config-data-custom\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.810387 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-internal-tls-certs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.812532 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-config-data\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.828040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8g26\" (UniqueName: \"kubernetes.io/projected/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-kube-api-access-n8g26\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.831098 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2-public-tls-certs\") pod \"barbican-api-d56769fd6-btjcl\" (UID: \"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2\") " pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:13 crc kubenswrapper[4763]: I1201 09:34:13.846660 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:14 crc kubenswrapper[4763]: I1201 09:34:14.091839 4763 generic.go:334] "Generic (PLEG): container finished" podID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerID="3ed39fb8f70ee1b9dc1423f185906c7d882af9de3f9c86e9bb5328fcaf35cb32" exitCode=0 Dec 01 09:34:14 crc kubenswrapper[4763]: I1201 09:34:14.092017 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerDied","Data":"3ed39fb8f70ee1b9dc1423f185906c7d882af9de3f9c86e9bb5328fcaf35cb32"} Dec 01 09:34:14 crc kubenswrapper[4763]: I1201 09:34:14.856043 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:14 crc kubenswrapper[4763]: I1201 09:34:14.864785 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.023993 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-config-data\") pod \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024043 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-log-httpd\") pod \"c135d6bf-430b-4fac-9b05-470df1e82e01\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024076 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-db-sync-config-data\") pod \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024122 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce04c2cb-ee0d-4530-8007-a853f1d4e785-etc-machine-id\") pod \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-scripts\") pod \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024161 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-sg-core-conf-yaml\") pod \"c135d6bf-430b-4fac-9b05-470df1e82e01\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024256 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br8c4\" (UniqueName: \"kubernetes.io/projected/ce04c2cb-ee0d-4530-8007-a853f1d4e785-kube-api-access-br8c4\") pod \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024301 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-combined-ca-bundle\") pod \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\" (UID: \"ce04c2cb-ee0d-4530-8007-a853f1d4e785\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024340 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-scripts\") pod \"c135d6bf-430b-4fac-9b05-470df1e82e01\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024428 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-config-data\") pod \"c135d6bf-430b-4fac-9b05-470df1e82e01\" (UID: 
\"c135d6bf-430b-4fac-9b05-470df1e82e01\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024489 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vdq8\" (UniqueName: \"kubernetes.io/projected/c135d6bf-430b-4fac-9b05-470df1e82e01-kube-api-access-5vdq8\") pod \"c135d6bf-430b-4fac-9b05-470df1e82e01\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024511 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-combined-ca-bundle\") pod \"c135d6bf-430b-4fac-9b05-470df1e82e01\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.024529 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-run-httpd\") pod \"c135d6bf-430b-4fac-9b05-470df1e82e01\" (UID: \"c135d6bf-430b-4fac-9b05-470df1e82e01\") " Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.025427 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c135d6bf-430b-4fac-9b05-470df1e82e01" (UID: "c135d6bf-430b-4fac-9b05-470df1e82e01"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.025804 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce04c2cb-ee0d-4530-8007-a853f1d4e785-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ce04c2cb-ee0d-4530-8007-a853f1d4e785" (UID: "ce04c2cb-ee0d-4530-8007-a853f1d4e785"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.026164 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c135d6bf-430b-4fac-9b05-470df1e82e01" (UID: "c135d6bf-430b-4fac-9b05-470df1e82e01"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.032995 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c135d6bf-430b-4fac-9b05-470df1e82e01-kube-api-access-5vdq8" (OuterVolumeSpecName: "kube-api-access-5vdq8") pod "c135d6bf-430b-4fac-9b05-470df1e82e01" (UID: "c135d6bf-430b-4fac-9b05-470df1e82e01"). InnerVolumeSpecName "kube-api-access-5vdq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.033097 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ce04c2cb-ee0d-4530-8007-a853f1d4e785" (UID: "ce04c2cb-ee0d-4530-8007-a853f1d4e785"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.035244 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-scripts" (OuterVolumeSpecName: "scripts") pod "c135d6bf-430b-4fac-9b05-470df1e82e01" (UID: "c135d6bf-430b-4fac-9b05-470df1e82e01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.037638 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-scripts" (OuterVolumeSpecName: "scripts") pod "ce04c2cb-ee0d-4530-8007-a853f1d4e785" (UID: "ce04c2cb-ee0d-4530-8007-a853f1d4e785"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.040370 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce04c2cb-ee0d-4530-8007-a853f1d4e785-kube-api-access-br8c4" (OuterVolumeSpecName: "kube-api-access-br8c4") pod "ce04c2cb-ee0d-4530-8007-a853f1d4e785" (UID: "ce04c2cb-ee0d-4530-8007-a853f1d4e785"). InnerVolumeSpecName "kube-api-access-br8c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.108688 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c135d6bf-430b-4fac-9b05-470df1e82e01" (UID: "c135d6bf-430b-4fac-9b05-470df1e82e01"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.115047 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce04c2cb-ee0d-4530-8007-a853f1d4e785" (UID: "ce04c2cb-ee0d-4530-8007-a853f1d4e785"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130378 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130425 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vdq8\" (UniqueName: \"kubernetes.io/projected/c135d6bf-430b-4fac-9b05-470df1e82e01-kube-api-access-5vdq8\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130475 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130487 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c135d6bf-430b-4fac-9b05-470df1e82e01-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130499 4763 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130510 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce04c2cb-ee0d-4530-8007-a853f1d4e785-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130520 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130531 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130541 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br8c4\" (UniqueName: \"kubernetes.io/projected/ce04c2cb-ee0d-4530-8007-a853f1d4e785-kube-api-access-br8c4\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.130553 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.142273 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vh7wz" event={"ID":"ce04c2cb-ee0d-4530-8007-a853f1d4e785","Type":"ContainerDied","Data":"5e90ccd37fe7424482a9de8a8ce346a967ff16065f056d3247230238515509f0"} Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.142325 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e90ccd37fe7424482a9de8a8ce346a967ff16065f056d3247230238515509f0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.142331 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-vh7wz" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.145464 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c135d6bf-430b-4fac-9b05-470df1e82e01","Type":"ContainerDied","Data":"6367e1a058667bc4ae1a865750a90585d9002adb4d27142a8bac502a46b5a485"} Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.145636 4763 scope.go:117] "RemoveContainer" containerID="0065ad64b1673ef3fda346b29896e6309c7c991dd8db8d311f18222ce2cda70b" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.145790 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.190808 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c135d6bf-430b-4fac-9b05-470df1e82e01" (UID: "c135d6bf-430b-4fac-9b05-470df1e82e01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.207374 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-config-data" (OuterVolumeSpecName: "config-data") pod "ce04c2cb-ee0d-4530-8007-a853f1d4e785" (UID: "ce04c2cb-ee0d-4530-8007-a853f1d4e785"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.232020 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.232051 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce04c2cb-ee0d-4530-8007-a853f1d4e785-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.286749 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-config-data" (OuterVolumeSpecName: "config-data") pod "c135d6bf-430b-4fac-9b05-470df1e82e01" (UID: "c135d6bf-430b-4fac-9b05-470df1e82e01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.338992 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c135d6bf-430b-4fac-9b05-470df1e82e01-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.411880 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:15 crc kubenswrapper[4763]: E1201 09:34:15.412266 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce04c2cb-ee0d-4530-8007-a853f1d4e785" containerName="cinder-db-sync" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.412284 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce04c2cb-ee0d-4530-8007-a853f1d4e785" containerName="cinder-db-sync" Dec 01 09:34:15 crc kubenswrapper[4763]: E1201 09:34:15.412299 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="sg-core" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.412305 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="sg-core" Dec 01 09:34:15 crc kubenswrapper[4763]: E1201 09:34:15.412315 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="proxy-httpd" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.412321 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="proxy-httpd" Dec 01 09:34:15 crc kubenswrapper[4763]: E1201 09:34:15.412332 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="ceilometer-central-agent" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.412338 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="ceilometer-central-agent" Dec 01 09:34:15 crc kubenswrapper[4763]: E1201 09:34:15.412361 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="ceilometer-notification-agent" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.412367 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="ceilometer-notification-agent" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.418774 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="ceilometer-notification-agent" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.418853 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce04c2cb-ee0d-4530-8007-a853f1d4e785" containerName="cinder-db-sync" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.418866 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="proxy-httpd" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.418882 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="sg-core" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.418900 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" containerName="ceilometer-central-agent" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 
09:34:15.420139 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.424518 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.437622 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.539105 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-f82xm"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.539503 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869f779d85-f82xm" podUID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" containerName="dnsmasq-dns" containerID="cri-o://33214036a9d2350e99f7186c256dab419514b3c4aeaaa874d318e5c8651af031" gracePeriod=10 Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.543658 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.543736 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.543791 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.543870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.543890 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b781378d-290f-4d35-b317-058edd3ae9ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.543957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4z7b\" (UniqueName: \"kubernetes.io/projected/b781378d-290f-4d35-b317-058edd3ae9ea-kube-api-access-x4z7b\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.572171 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.610565 4763 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.637782 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.648916 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4z7b\" (UniqueName: \"kubernetes.io/projected/b781378d-290f-4d35-b317-058edd3ae9ea-kube-api-access-x4z7b\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.648992 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.649024 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.649056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.649113 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.649130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b781378d-290f-4d35-b317-058edd3ae9ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.649200 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b781378d-290f-4d35-b317-058edd3ae9ea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.653139 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-scripts\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.653508 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.658599 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.659151 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.661924 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.664324 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.665081 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.698039 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4z7b\" (UniqueName: \"kubernetes.io/projected/b781378d-290f-4d35-b317-058edd3ae9ea-kube-api-access-x4z7b\") pod \"cinder-scheduler-0\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.699333 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.712615 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-v6l7k"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.714421 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.735421 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-v6l7k"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.739713 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.752044 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.752113 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-scripts\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.752184 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.752226 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-run-httpd\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.752253 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-config-data\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.752273 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-log-httpd\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.752298 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnrch\" (UniqueName: \"kubernetes.io/projected/523216bb-3506-4d7e-a7f7-7a50af4d838e-kube-api-access-tnrch\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.849665 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.851084 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.855589 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856390 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-dns-svc\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856423 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856470 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-run-httpd\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856494 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljl4x\" (UniqueName: \"kubernetes.io/projected/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-kube-api-access-ljl4x\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856513 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-config-data\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856529 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-log-httpd\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856549 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnrch\" (UniqueName: \"kubernetes.io/projected/523216bb-3506-4d7e-a7f7-7a50af4d838e-kube-api-access-tnrch\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856594 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-config\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: 
\"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856620 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856656 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.856695 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-scripts\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.857522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-run-httpd\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.857766 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-log-httpd\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.867277 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.867899 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.868671 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-config-data\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.888233 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnrch\" (UniqueName: \"kubernetes.io/projected/523216bb-3506-4d7e-a7f7-7a50af4d838e-kube-api-access-tnrch\") pod \"ceilometer-0\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.889771 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-scripts\") pod \"ceilometer-0\" (UID: 
\"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " pod="openstack/ceilometer-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.903473 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964304 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-config\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964378 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964425 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xjcj\" (UniqueName: \"kubernetes.io/projected/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-kube-api-access-2xjcj\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964444 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964498 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964524 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-dns-svc\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljl4x\" (UniqueName: \"kubernetes.io/projected/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-kube-api-access-ljl4x\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964586 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964618 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-logs\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964636 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-scripts\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.964655 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.965402 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-config\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.966070 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.966975 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.967686 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-dns-svc\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:15 crc kubenswrapper[4763]: I1201 09:34:15.984769 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljl4x\" (UniqueName: \"kubernetes.io/projected/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-kube-api-access-ljl4x\") pod \"dnsmasq-dns-58db5546cc-v6l7k\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.001891 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.056242 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.066373 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.066692 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-logs\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.066773 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-scripts\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.066843 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.066944 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.067054 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xjcj\" (UniqueName: \"kubernetes.io/projected/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-kube-api-access-2xjcj\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.067127 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.072129 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-logs\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.066490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.075885 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-scripts\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " 
pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.076857 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.077655 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.078238 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.094494 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xjcj\" (UniqueName: \"kubernetes.io/projected/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-kube-api-access-2xjcj\") pod \"cinder-api-0\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.104252 4763 scope.go:117] "RemoveContainer" containerID="9135b60634a809faa34954b213b43cd22625896aed679407b9f15baf45bd4da5" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.168996 4763 generic.go:334] "Generic (PLEG): container finished" podID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" containerID="33214036a9d2350e99f7186c256dab419514b3c4aeaaa874d318e5c8651af031" exitCode=0 Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.169044 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-f82xm" event={"ID":"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f","Type":"ContainerDied","Data":"33214036a9d2350e99f7186c256dab419514b3c4aeaaa874d318e5c8651af031"} Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.242936 4763 scope.go:117] "RemoveContainer" containerID="3ed39fb8f70ee1b9dc1423f185906c7d882af9de3f9c86e9bb5328fcaf35cb32" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.287095 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.351012 4763 scope.go:117] "RemoveContainer" containerID="6cf3dbbf7a2911884595a44b39b0c998675e710d51136c4870051374ae74091b" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.493344 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-f82xm" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.577490 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-sb\") pod \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.577626 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-dns-svc\") pod \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.577726 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bndj\" (UniqueName: \"kubernetes.io/projected/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-kube-api-access-6bndj\") pod \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.577749 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-config\") pod \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.577790 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-nb\") pod \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\" (UID: \"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f\") " Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.616582 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-kube-api-access-6bndj" (OuterVolumeSpecName: "kube-api-access-6bndj") pod "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" (UID: "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f"). InnerVolumeSpecName "kube-api-access-6bndj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.672542 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" (UID: "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.680392 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bndj\" (UniqueName: \"kubernetes.io/projected/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-kube-api-access-6bndj\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.680421 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.705406 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" (UID: "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.714722 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" (UID: "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.746759 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-config" (OuterVolumeSpecName: "config") pod "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" (UID: "c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.773351 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d56769fd6-btjcl"] Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.781750 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.781775 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.781784 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:16 crc kubenswrapper[4763]: I1201 09:34:16.947639 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:16 crc kubenswrapper[4763]: W1201 09:34:16.949404 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb781378d_290f_4d35_b317_058edd3ae9ea.slice/crio-6bedf941dd155076cbcf29c4ca5fa917e1d32894be027c4b6db3f98bd927e190 WatchSource:0}: Error finding container 6bedf941dd155076cbcf29c4ca5fa917e1d32894be027c4b6db3f98bd927e190: Status 404 returned error can't find the container with id 6bedf941dd155076cbcf29c4ca5fa917e1d32894be027c4b6db3f98bd927e190 Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.007698 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c135d6bf-430b-4fac-9b05-470df1e82e01" path="/var/lib/kubelet/pods/c135d6bf-430b-4fac-9b05-470df1e82e01/volumes" Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.096302 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-v6l7k"] Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.108624 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.213895 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.221009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" event={"ID":"6c4d1d3d-a555-47d6-ac42-29c65c3c0559","Type":"ContainerStarted","Data":"819960ded2e5ff3fd65dc320355dc6140e1167eed86a19a76120e7606e3cdf22"} Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.249134 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerStarted","Data":"f7ac24bfe146cc97258c4abd08a0bd099badc221a0a7b0f1e857cfc5cc5c55c9"} Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.274273 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv" event={"ID":"2a3e2134-8fc4-4ec2-9970-92959ed1778e","Type":"ContainerStarted","Data":"f89da5f8083a8d21aa00e6804f969b0b65b84bdfc8e4c58c321a5444ee7795aa"} Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.305798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6948ddcbd7-94xjg" 
event={"ID":"9dac2a9d-f2f8-4167-8f68-f01c9364a59f","Type":"ContainerStarted","Data":"67185a3ff48860a96e42ffebf25bff94af57447a302f8ecf177008bb7df456cc"} Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.319109 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b781378d-290f-4d35-b317-058edd3ae9ea","Type":"ContainerStarted","Data":"6bedf941dd155076cbcf29c4ca5fa917e1d32894be027c4b6db3f98bd927e190"} Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.332014 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d56769fd6-btjcl" event={"ID":"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2","Type":"ContainerStarted","Data":"eceb0b66dcbc42279b6f108adfa60b5c973eae4a5a2c549067813f2a8cc929a5"} Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.346923 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6948ddcbd7-94xjg" podStartSLOduration=2.601143707 podStartE2EDuration="7.346896344s" podCreationTimestamp="2025-12-01 09:34:10 +0000 UTC" firstStartedPulling="2025-12-01 09:34:11.497275442 +0000 UTC m=+1168.765924220" lastFinishedPulling="2025-12-01 09:34:16.243028089 +0000 UTC m=+1173.511676857" observedRunningTime="2025-12-01 09:34:17.335385799 +0000 UTC m=+1174.604034567" watchObservedRunningTime="2025-12-01 09:34:17.346896344 +0000 UTC m=+1174.615545112" Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.362541 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv" podStartSLOduration=2.46647514 podStartE2EDuration="7.362518184s" podCreationTimestamp="2025-12-01 09:34:10 +0000 UTC" firstStartedPulling="2025-12-01 09:34:11.346978215 +0000 UTC m=+1168.615626983" lastFinishedPulling="2025-12-01 09:34:16.243021259 +0000 UTC m=+1173.511670027" observedRunningTime="2025-12-01 09:34:17.306765482 +0000 UTC m=+1174.575414250" watchObservedRunningTime="2025-12-01 09:34:17.362518184 +0000 UTC m=+1174.631166972" Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.372112 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-f82xm" event={"ID":"c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f","Type":"ContainerDied","Data":"92e994da55c997e6f8baea1b0d0129e76a1613e0fe91e7fbbd18f7cb29eac389"} Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.372544 4763 scope.go:117] "RemoveContainer" containerID="33214036a9d2350e99f7186c256dab419514b3c4aeaaa874d318e5c8651af031" Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.372482 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-f82xm" Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.413497 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-f82xm"] Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.427124 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-f82xm"] Dec 01 09:34:17 crc kubenswrapper[4763]: I1201 09:34:17.483904 4763 scope.go:117] "RemoveContainer" containerID="45f1ba37454104587d60a43f4c850df340dad33075fa4904648d600ed2824050" Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.408585 4763 generic.go:334] "Generic (PLEG): container finished" podID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" containerID="a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934" exitCode=0 Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.409080 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" event={"ID":"6c4d1d3d-a555-47d6-ac42-29c65c3c0559","Type":"ContainerDied","Data":"a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934"} Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.414752 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerStarted","Data":"3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08"} Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.417835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7db8c7446b-kkdjv" event={"ID":"2a3e2134-8fc4-4ec2-9970-92959ed1778e","Type":"ContainerStarted","Data":"49b5d3e481c55fc7cb90995feb6f0287c037f7949b8698f54176a5673492bb5c"} Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.430096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6948ddcbd7-94xjg" event={"ID":"9dac2a9d-f2f8-4167-8f68-f01c9364a59f","Type":"ContainerStarted","Data":"1327a85022dbf1f5d031fcc4ffafd717008f3dc9112a904c65987f6da81f0300"} Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.436798 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d56769fd6-btjcl" event={"ID":"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2","Type":"ContainerStarted","Data":"654a3b9e93106608951828c8bd41f0ac9f5a74179aad1bbc0508ea1bcf37545f"} Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.437055 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d56769fd6-btjcl" event={"ID":"6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2","Type":"ContainerStarted","Data":"b1a8ff39ec29ebb5429ed8b3867db6fed15b7a7ce01ff97c51b5973a14c8a089"} Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.437293 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.437507 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.447361 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a15a5c8-5af9-47e2-985f-c98efcdc46f9","Type":"ContainerStarted","Data":"bcd5fcdfe95f45cae71c5d6552851a09d6e31d8d100038847d2395d949f12747"} Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.637719 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-d56769fd6-btjcl" podStartSLOduration=5.637697889 podStartE2EDuration="5.637697889s" podCreationTimestamp="2025-12-01 09:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:18.504040641 +0000 UTC m=+1175.772689409" watchObservedRunningTime="2025-12-01 09:34:18.637697889 +0000 UTC m=+1175.906346657" Dec 01 09:34:18 crc kubenswrapper[4763]: I1201 09:34:18.644643 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:19 crc kubenswrapper[4763]: I1201 09:34:19.057265 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" path="/var/lib/kubelet/pods/c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f/volumes" Dec 01 09:34:19 crc kubenswrapper[4763]: I1201 09:34:19.477000 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a15a5c8-5af9-47e2-985f-c98efcdc46f9","Type":"ContainerStarted","Data":"5c5f23cbedf8cb01e336ca0f3c8eff3b685c9a20ea65db5ce171bacad5aebaeb"} Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.487383 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b781378d-290f-4d35-b317-058edd3ae9ea","Type":"ContainerStarted","Data":"180b962ffad1df1a6a8129211d77e901e7e48675be5d1af2ba1c8a662bde876e"} Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.500689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a15a5c8-5af9-47e2-985f-c98efcdc46f9","Type":"ContainerStarted","Data":"33d8d8da1b425d65709f154dfa6e7b54a08e7261d1d3b69e069602105c8d1853"} Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.500853 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerName="cinder-api-log" containerID="cri-o://5c5f23cbedf8cb01e336ca0f3c8eff3b685c9a20ea65db5ce171bacad5aebaeb" gracePeriod=30 Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.501108 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.501323 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerName="cinder-api" containerID="cri-o://33d8d8da1b425d65709f154dfa6e7b54a08e7261d1d3b69e069602105c8d1853" gracePeriod=30 Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.517739 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" event={"ID":"6c4d1d3d-a555-47d6-ac42-29c65c3c0559","Type":"ContainerStarted","Data":"3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b"} Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.518859 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.531164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerStarted","Data":"6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76"} Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.534713 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=5.534691749 podStartE2EDuration="5.534691749s" podCreationTimestamp="2025-12-01 09:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:20.522839875 +0000 UTC m=+1177.791488653" watchObservedRunningTime="2025-12-01 09:34:20.534691749 +0000 UTC m=+1177.803340517" Dec 01 09:34:20 crc kubenswrapper[4763]: I1201 09:34:20.547740 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" podStartSLOduration=5.547722756 podStartE2EDuration="5.547722756s" podCreationTimestamp="2025-12-01 09:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:20.546237164 +0000 UTC m=+1177.814885932" watchObservedRunningTime="2025-12-01 09:34:20.547722756 +0000 UTC m=+1177.816371524" Dec 01 09:34:20 crc kubenswrapper[4763]: E1201 09:34:20.738194 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a15a5c8_5af9_47e2_985f_c98efcdc46f9.slice/crio-5c5f23cbedf8cb01e336ca0f3c8eff3b685c9a20ea65db5ce171bacad5aebaeb.scope\": RecentStats: unable to find data in memory cache]" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.604703 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerID="33d8d8da1b425d65709f154dfa6e7b54a08e7261d1d3b69e069602105c8d1853" exitCode=0 Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.605226 4763 generic.go:334] "Generic (PLEG): container finished" podID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerID="5c5f23cbedf8cb01e336ca0f3c8eff3b685c9a20ea65db5ce171bacad5aebaeb" exitCode=143 Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.605298 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a15a5c8-5af9-47e2-985f-c98efcdc46f9","Type":"ContainerDied","Data":"33d8d8da1b425d65709f154dfa6e7b54a08e7261d1d3b69e069602105c8d1853"} Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.605325 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a15a5c8-5af9-47e2-985f-c98efcdc46f9","Type":"ContainerDied","Data":"5c5f23cbedf8cb01e336ca0f3c8eff3b685c9a20ea65db5ce171bacad5aebaeb"} Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.605334 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a15a5c8-5af9-47e2-985f-c98efcdc46f9","Type":"ContainerDied","Data":"bcd5fcdfe95f45cae71c5d6552851a09d6e31d8d100038847d2395d949f12747"} Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.605343 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcd5fcdfe95f45cae71c5d6552851a09d6e31d8d100038847d2395d949f12747" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.646727 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerStarted","Data":"dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c"} Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.665417 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"b781378d-290f-4d35-b317-058edd3ae9ea","Type":"ContainerStarted","Data":"4835c657b96bc3d4e46c79b97f63c9584b01cb5bd596af4f0b7540a54635bd6f"} Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.673378 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.686613 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.032654376 podStartE2EDuration="6.686591387s" podCreationTimestamp="2025-12-01 09:34:15 +0000 UTC" firstStartedPulling="2025-12-01 09:34:16.952776384 +0000 UTC m=+1174.221425152" lastFinishedPulling="2025-12-01 09:34:18.606713395 +0000 UTC m=+1175.875362163" observedRunningTime="2025-12-01 09:34:21.682735438 +0000 UTC m=+1178.951384216" watchObservedRunningTime="2025-12-01 09:34:21.686591387 +0000 UTC m=+1178.955240145" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.853048 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-scripts\") pod \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.853388 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data-custom\") pod \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.853672 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-combined-ca-bundle\") pod \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.853819 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-logs\") pod \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.853923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-etc-machine-id\") pod \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.854113 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xjcj\" (UniqueName: \"kubernetes.io/projected/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-kube-api-access-2xjcj\") pod \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.854248 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data\") pod \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\" (UID: \"8a15a5c8-5af9-47e2-985f-c98efcdc46f9\") " Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.855100 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8a15a5c8-5af9-47e2-985f-c98efcdc46f9" (UID: "8a15a5c8-5af9-47e2-985f-c98efcdc46f9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.855364 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-logs" (OuterVolumeSpecName: "logs") pod "8a15a5c8-5af9-47e2-985f-c98efcdc46f9" (UID: "8a15a5c8-5af9-47e2-985f-c98efcdc46f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.861430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-scripts" (OuterVolumeSpecName: "scripts") pod "8a15a5c8-5af9-47e2-985f-c98efcdc46f9" (UID: "8a15a5c8-5af9-47e2-985f-c98efcdc46f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.862616 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8a15a5c8-5af9-47e2-985f-c98efcdc46f9" (UID: "8a15a5c8-5af9-47e2-985f-c98efcdc46f9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.862824 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-kube-api-access-2xjcj" (OuterVolumeSpecName: "kube-api-access-2xjcj") pod "8a15a5c8-5af9-47e2-985f-c98efcdc46f9" (UID: "8a15a5c8-5af9-47e2-985f-c98efcdc46f9"). InnerVolumeSpecName "kube-api-access-2xjcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.933600 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data" (OuterVolumeSpecName: "config-data") pod "8a15a5c8-5af9-47e2-985f-c98efcdc46f9" (UID: "8a15a5c8-5af9-47e2-985f-c98efcdc46f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.957633 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.957679 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.957694 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xjcj\" (UniqueName: \"kubernetes.io/projected/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-kube-api-access-2xjcj\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.957706 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.957719 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.957731 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:21 crc kubenswrapper[4763]: I1201 09:34:21.969962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a15a5c8-5af9-47e2-985f-c98efcdc46f9" (UID: "8a15a5c8-5af9-47e2-985f-c98efcdc46f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.059237 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a15a5c8-5af9-47e2-985f-c98efcdc46f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.686138 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.817518 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.848740 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.870148 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:22 crc kubenswrapper[4763]: E1201 09:34:22.870843 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" containerName="dnsmasq-dns" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.870861 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" containerName="dnsmasq-dns" Dec 01 09:34:22 crc kubenswrapper[4763]: E1201 09:34:22.870872 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerName="cinder-api" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.870880 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerName="cinder-api" Dec 01 09:34:22 crc kubenswrapper[4763]: E1201 09:34:22.870905 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" containerName="init" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.870912 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" containerName="init" Dec 01 09:34:22 crc kubenswrapper[4763]: E1201 09:34:22.870943 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerName="cinder-api-log" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.870950 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerName="cinder-api-log" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.871099 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2bfd1ff-2eb6-49a1-97e8-4cfcd9406a6f" containerName="dnsmasq-dns" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.871119 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerName="cinder-api-log" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.871129 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" containerName="cinder-api" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.872136 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.878840 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.881353 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.881749 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.881978 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886745 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-scripts\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886810 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054a3443-a215-4987-993e-ea2d282c1d0d-logs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886836 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886862 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886897 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/054a3443-a215-4987-993e-ea2d282c1d0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886913 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-config-data\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886932 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886954 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrwcs\" (UniqueName: 
\"kubernetes.io/projected/054a3443-a215-4987-993e-ea2d282c1d0d-kube-api-access-wrwcs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.886985 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.988602 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.988716 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/054a3443-a215-4987-993e-ea2d282c1d0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.988749 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-config-data\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.989398 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/054a3443-a215-4987-993e-ea2d282c1d0d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.989418 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.989651 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrwcs\" (UniqueName: \"kubernetes.io/projected/054a3443-a215-4987-993e-ea2d282c1d0d-kube-api-access-wrwcs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.989789 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.989925 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-scripts\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.990097 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054a3443-a215-4987-993e-ea2d282c1d0d-logs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.990157 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.990709 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/054a3443-a215-4987-993e-ea2d282c1d0d-logs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.995015 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-config-data\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:22 crc kubenswrapper[4763]: I1201 09:34:22.996266 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.013586 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-scripts\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.013980 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.015743 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a15a5c8-5af9-47e2-985f-c98efcdc46f9" path="/var/lib/kubelet/pods/8a15a5c8-5af9-47e2-985f-c98efcdc46f9/volumes" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.025871 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-config-data-custom\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.028279 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054a3443-a215-4987-993e-ea2d282c1d0d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.030331 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrwcs\" (UniqueName: \"kubernetes.io/projected/054a3443-a215-4987-993e-ea2d282c1d0d-kube-api-access-wrwcs\") pod \"cinder-api-0\" (UID: 
\"054a3443-a215-4987-993e-ea2d282c1d0d\") " pod="openstack/cinder-api-0" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.192816 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.591898 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.732820 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerStarted","Data":"346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52"} Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.733949 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.784837 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.207171122 podStartE2EDuration="8.784816209s" podCreationTimestamp="2025-12-01 09:34:15 +0000 UTC" firstStartedPulling="2025-12-01 09:34:17.11331945 +0000 UTC m=+1174.381968218" lastFinishedPulling="2025-12-01 09:34:22.690964527 +0000 UTC m=+1179.959613305" observedRunningTime="2025-12-01 09:34:23.776041472 +0000 UTC m=+1181.044690250" watchObservedRunningTime="2025-12-01 09:34:23.784816209 +0000 UTC m=+1181.053464977" Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.821824 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 01 09:34:23 crc kubenswrapper[4763]: I1201 09:34:23.933806 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-798f56bcd6-5xxds" Dec 01 09:34:24 crc kubenswrapper[4763]: I1201 09:34:24.715368 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-798f56bcd6-5xxds" Dec 01 09:34:24 crc kubenswrapper[4763]: I1201 09:34:24.884053 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"054a3443-a215-4987-993e-ea2d282c1d0d","Type":"ContainerStarted","Data":"9a15fa0fa73326b6fc8a226c7a82a7a81d9a1d54c9331295425566d1828c2a2f"} Dec 01 09:34:25 crc kubenswrapper[4763]: I1201 09:34:25.283326 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7cddbbbc75-sb8z7" Dec 01 09:34:25 crc kubenswrapper[4763]: I1201 09:34:25.741746 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 09:34:25 crc kubenswrapper[4763]: I1201 09:34:25.906953 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"054a3443-a215-4987-993e-ea2d282c1d0d","Type":"ContainerStarted","Data":"4002c22a14bdb39dfde530e45aaa33dd3175cf4880979d3d47af86c466e46510"} Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.058705 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.149533 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-cv6kb"] Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.149771 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" podUID="e54f7054-aade-43c0-86b7-5338d093463a" 
containerName="dnsmasq-dns" containerID="cri-o://b75d0c3be6ced7a2f6c56e2417b325e3bf167200900857ebac092edf62ef58d8" gracePeriod=10 Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.213951 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54896b4dfc-stxgl" Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.229825 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.319532 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d79d596fb-f8qcf"] Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.319834 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d79d596fb-f8qcf" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerName="neutron-api" containerID="cri-o://1e992a86e6515341c22c2de89a92b32a7629cb525b534d1e4af94a02200cf768" gracePeriod=30 Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.319986 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d79d596fb-f8qcf" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerName="neutron-httpd" containerID="cri-o://835d0dbca7e4a633961e6837fbc3d659420f7e5fb545c50f64b71d9f99650e2b" gracePeriod=30 Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.387667 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.914961 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.961579 4763 generic.go:334] "Generic (PLEG): container finished" podID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerID="835d0dbca7e4a633961e6837fbc3d659420f7e5fb545c50f64b71d9f99650e2b" exitCode=0 Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.961633 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d79d596fb-f8qcf" event={"ID":"5b735fb9-9fbf-4d03-9b87-e6d57634813a","Type":"ContainerDied","Data":"835d0dbca7e4a633961e6837fbc3d659420f7e5fb545c50f64b71d9f99650e2b"} Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.985969 4763 generic.go:334] "Generic (PLEG): container finished" podID="e54f7054-aade-43c0-86b7-5338d093463a" containerID="b75d0c3be6ced7a2f6c56e2417b325e3bf167200900857ebac092edf62ef58d8" exitCode=0 Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.986098 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bbd79555b-kk8vr" Dec 01 09:34:26 crc kubenswrapper[4763]: I1201 09:34:26.986139 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" event={"ID":"e54f7054-aade-43c0-86b7-5338d093463a","Type":"ContainerDied","Data":"b75d0c3be6ced7a2f6c56e2417b325e3bf167200900857ebac092edf62ef58d8"} Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.001698 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b781378d-290f-4d35-b317-058edd3ae9ea" containerName="cinder-scheduler" containerID="cri-o://180b962ffad1df1a6a8129211d77e901e7e48675be5d1af2ba1c8a662bde876e" gracePeriod=30 Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.002797 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="b781378d-290f-4d35-b317-058edd3ae9ea" containerName="probe" containerID="cri-o://4835c657b96bc3d4e46c79b97f63c9584b01cb5bd596af4f0b7540a54635bd6f" gracePeriod=30 Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.017647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"054a3443-a215-4987-993e-ea2d282c1d0d","Type":"ContainerStarted","Data":"b5aee2fd0f536994eea5873779d6ca190b51172039776669e431270602726a54"} Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.017730 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.080013 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.089873 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.089846498 podStartE2EDuration="5.089846498s" podCreationTimestamp="2025-12-01 09:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:27.080233947 +0000 UTC m=+1184.348882715" watchObservedRunningTime="2025-12-01 09:34:27.089846498 +0000 UTC m=+1184.358495266" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.278376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-nb\") pod \"e54f7054-aade-43c0-86b7-5338d093463a\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.278769 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drkqn\" (UniqueName: \"kubernetes.io/projected/e54f7054-aade-43c0-86b7-5338d093463a-kube-api-access-drkqn\") pod \"e54f7054-aade-43c0-86b7-5338d093463a\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.278812 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-config\") pod \"e54f7054-aade-43c0-86b7-5338d093463a\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.278922 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-sb\") pod \"e54f7054-aade-43c0-86b7-5338d093463a\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.278983 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-dns-svc\") pod \"e54f7054-aade-43c0-86b7-5338d093463a\" (UID: \"e54f7054-aade-43c0-86b7-5338d093463a\") " Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.304746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54f7054-aade-43c0-86b7-5338d093463a-kube-api-access-drkqn" (OuterVolumeSpecName: "kube-api-access-drkqn") pod "e54f7054-aade-43c0-86b7-5338d093463a" (UID: "e54f7054-aade-43c0-86b7-5338d093463a"). InnerVolumeSpecName "kube-api-access-drkqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.367101 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-config" (OuterVolumeSpecName: "config") pod "e54f7054-aade-43c0-86b7-5338d093463a" (UID: "e54f7054-aade-43c0-86b7-5338d093463a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.382601 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drkqn\" (UniqueName: \"kubernetes.io/projected/e54f7054-aade-43c0-86b7-5338d093463a-kube-api-access-drkqn\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.382631 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.426099 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e54f7054-aade-43c0-86b7-5338d093463a" (UID: "e54f7054-aade-43c0-86b7-5338d093463a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.477932 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e54f7054-aade-43c0-86b7-5338d093463a" (UID: "e54f7054-aade-43c0-86b7-5338d093463a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.479976 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e54f7054-aade-43c0-86b7-5338d093463a" (UID: "e54f7054-aade-43c0-86b7-5338d093463a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.484668 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.484759 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.484824 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e54f7054-aade-43c0-86b7-5338d093463a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.859721 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-d56769fd6-btjcl" podUID="6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.147:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:27 crc kubenswrapper[4763]: I1201 09:34:27.859721 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-d56769fd6-btjcl" podUID="6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.147:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.012477 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.012698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-cv6kb" event={"ID":"e54f7054-aade-43c0-86b7-5338d093463a","Type":"ContainerDied","Data":"3f55e315acf1e84451c4d0f1640190f762ab978b61a218e5a96832930de5c769"} Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.012732 4763 scope.go:117] "RemoveContainer" containerID="b75d0c3be6ced7a2f6c56e2417b325e3bf167200900857ebac092edf62ef58d8" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.043038 4763 scope.go:117] "RemoveContainer" containerID="509253a83b272760ff83bef8e71b01ef7f4f66cc2a222ca986db3f484956e4aa" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.076370 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-cv6kb"] Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.125203 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-cv6kb"] Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.311953 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 09:34:28 crc kubenswrapper[4763]: E1201 09:34:28.312482 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54f7054-aade-43c0-86b7-5338d093463a" containerName="init" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.312506 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54f7054-aade-43c0-86b7-5338d093463a" containerName="init" Dec 01 09:34:28 crc kubenswrapper[4763]: E1201 09:34:28.312572 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54f7054-aade-43c0-86b7-5338d093463a" containerName="dnsmasq-dns" Dec 01 
09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.312580 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54f7054-aade-43c0-86b7-5338d093463a" containerName="dnsmasq-dns" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.312833 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54f7054-aade-43c0-86b7-5338d093463a" containerName="dnsmasq-dns" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.313606 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.316911 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-99tdx" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.317592 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.322234 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.346447 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.413348 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config-secret\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.413431 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.413518 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.413597 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvx28\" (UniqueName: \"kubernetes.io/projected/1e160697-4938-43b5-b4e3-97d8199c8f03-kube-api-access-dvx28\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.515218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config-secret\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.515308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 
crc kubenswrapper[4763]: I1201 09:34:28.515392 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.515539 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvx28\" (UniqueName: \"kubernetes.io/projected/1e160697-4938-43b5-b4e3-97d8199c8f03-kube-api-access-dvx28\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.517185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.525664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config-secret\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.544426 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.555172 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvx28\" (UniqueName: \"kubernetes.io/projected/1e160697-4938-43b5-b4e3-97d8199c8f03-kube-api-access-dvx28\") pod \"openstackclient\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.638659 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.652342 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.667059 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.727826 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.729476 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.739770 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.830501 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2wj\" (UniqueName: \"kubernetes.io/projected/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-kube-api-access-5s2wj\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.830778 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-openstack-config\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.831022 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.831046 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.863879 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d56769fd6-btjcl" podUID="6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.147:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.863899 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d56769fd6-btjcl" podUID="6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.147:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.933061 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2wj\" (UniqueName: \"kubernetes.io/projected/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-kube-api-access-5s2wj\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.933135 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-openstack-config\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.933220 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.933252 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.935672 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-openstack-config\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: E1201 09:34:28.943369 4763 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 01 09:34:28 crc kubenswrapper[4763]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_1e160697-4938-43b5-b4e3-97d8199c8f03_0(f3d6e70aced9c191b6db31be14b1eef7f2edb686499f3155edb7d825253b1a42): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f3d6e70aced9c191b6db31be14b1eef7f2edb686499f3155edb7d825253b1a42" Netns:"/var/run/netns/d764331a-2ba5-4613-9dac-f621b76c584b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f3d6e70aced9c191b6db31be14b1eef7f2edb686499f3155edb7d825253b1a42;K8S_POD_UID=1e160697-4938-43b5-b4e3-97d8199c8f03" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/1e160697-4938-43b5-b4e3-97d8199c8f03]: expected pod UID "1e160697-4938-43b5-b4e3-97d8199c8f03" but got "b22eceb2-ee23-4a6c-a993-bb280fe2d41f" from Kube API Dec 01 09:34:28 crc kubenswrapper[4763]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 01 09:34:28 crc kubenswrapper[4763]: > Dec 01 09:34:28 crc kubenswrapper[4763]: E1201 09:34:28.943469 4763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 01 09:34:28 crc kubenswrapper[4763]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_1e160697-4938-43b5-b4e3-97d8199c8f03_0(f3d6e70aced9c191b6db31be14b1eef7f2edb686499f3155edb7d825253b1a42): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f3d6e70aced9c191b6db31be14b1eef7f2edb686499f3155edb7d825253b1a42" Netns:"/var/run/netns/d764331a-2ba5-4613-9dac-f621b76c584b" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f3d6e70aced9c191b6db31be14b1eef7f2edb686499f3155edb7d825253b1a42;K8S_POD_UID=1e160697-4938-43b5-b4e3-97d8199c8f03" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/1e160697-4938-43b5-b4e3-97d8199c8f03]: expected pod UID "1e160697-4938-43b5-b4e3-97d8199c8f03" but got "b22eceb2-ee23-4a6c-a993-bb280fe2d41f" from Kube API Dec 01 09:34:28 crc kubenswrapper[4763]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 01 09:34:28 crc kubenswrapper[4763]: > pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.944175 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.944805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:28 crc kubenswrapper[4763]: I1201 09:34:28.961105 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2wj\" (UniqueName: \"kubernetes.io/projected/b22eceb2-ee23-4a6c-a993-bb280fe2d41f-kube-api-access-5s2wj\") pod \"openstackclient\" (UID: \"b22eceb2-ee23-4a6c-a993-bb280fe2d41f\") " pod="openstack/openstackclient" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.016479 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e54f7054-aade-43c0-86b7-5338d093463a" path="/var/lib/kubelet/pods/e54f7054-aade-43c0-86b7-5338d093463a/volumes" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.082497 4763 generic.go:334] "Generic (PLEG): container finished" podID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerID="1e992a86e6515341c22c2de89a92b32a7629cb525b534d1e4af94a02200cf768" exitCode=0 Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.082566 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d79d596fb-f8qcf" event={"ID":"5b735fb9-9fbf-4d03-9b87-e6d57634813a","Type":"ContainerDied","Data":"1e992a86e6515341c22c2de89a92b32a7629cb525b534d1e4af94a02200cf768"} Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.086209 4763 generic.go:334] "Generic (PLEG): container finished" podID="b781378d-290f-4d35-b317-058edd3ae9ea" containerID="4835c657b96bc3d4e46c79b97f63c9584b01cb5bd596af4f0b7540a54635bd6f" exitCode=0 Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.086234 4763 generic.go:334] "Generic (PLEG): container finished" podID="b781378d-290f-4d35-b317-058edd3ae9ea" containerID="180b962ffad1df1a6a8129211d77e901e7e48675be5d1af2ba1c8a662bde876e" exitCode=0 Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.086282 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.086819 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b781378d-290f-4d35-b317-058edd3ae9ea","Type":"ContainerDied","Data":"4835c657b96bc3d4e46c79b97f63c9584b01cb5bd596af4f0b7540a54635bd6f"} Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.086841 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b781378d-290f-4d35-b317-058edd3ae9ea","Type":"ContainerDied","Data":"180b962ffad1df1a6a8129211d77e901e7e48675be5d1af2ba1c8a662bde876e"} Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.115214 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.119301 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1e160697-4938-43b5-b4e3-97d8199c8f03" podUID="b22eceb2-ee23-4a6c-a993-bb280fe2d41f" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.148467 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.244895 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-combined-ca-bundle\") pod \"1e160697-4938-43b5-b4e3-97d8199c8f03\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.244998 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config-secret\") pod \"1e160697-4938-43b5-b4e3-97d8199c8f03\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.245074 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvx28\" (UniqueName: \"kubernetes.io/projected/1e160697-4938-43b5-b4e3-97d8199c8f03-kube-api-access-dvx28\") pod \"1e160697-4938-43b5-b4e3-97d8199c8f03\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.245244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config\") pod \"1e160697-4938-43b5-b4e3-97d8199c8f03\" (UID: \"1e160697-4938-43b5-b4e3-97d8199c8f03\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.246477 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1e160697-4938-43b5-b4e3-97d8199c8f03" (UID: "1e160697-4938-43b5-b4e3-97d8199c8f03"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.253578 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1e160697-4938-43b5-b4e3-97d8199c8f03" (UID: "1e160697-4938-43b5-b4e3-97d8199c8f03"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.256176 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e160697-4938-43b5-b4e3-97d8199c8f03" (UID: "1e160697-4938-43b5-b4e3-97d8199c8f03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.261624 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e160697-4938-43b5-b4e3-97d8199c8f03-kube-api-access-dvx28" (OuterVolumeSpecName: "kube-api-access-dvx28") pod "1e160697-4938-43b5-b4e3-97d8199c8f03" (UID: "1e160697-4938-43b5-b4e3-97d8199c8f03"). InnerVolumeSpecName "kube-api-access-dvx28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.361895 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.361936 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvx28\" (UniqueName: \"kubernetes.io/projected/1e160697-4938-43b5-b4e3-97d8199c8f03-kube-api-access-dvx28\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.361946 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1e160697-4938-43b5-b4e3-97d8199c8f03-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.361953 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e160697-4938-43b5-b4e3-97d8199c8f03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.451893 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.552351 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.568985 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-config\") pod \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.569061 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-combined-ca-bundle\") pod \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.569125 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-httpd-config\") pod \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.569179 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvddj\" (UniqueName: \"kubernetes.io/projected/5b735fb9-9fbf-4d03-9b87-e6d57634813a-kube-api-access-qvddj\") pod \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.569233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-ovndb-tls-certs\") pod \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\" (UID: \"5b735fb9-9fbf-4d03-9b87-e6d57634813a\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.633568 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5b735fb9-9fbf-4d03-9b87-e6d57634813a" (UID: "5b735fb9-9fbf-4d03-9b87-e6d57634813a"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.677132 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data-custom\") pod \"b781378d-290f-4d35-b317-058edd3ae9ea\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.677300 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b781378d-290f-4d35-b317-058edd3ae9ea-etc-machine-id\") pod \"b781378d-290f-4d35-b317-058edd3ae9ea\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.677400 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data\") pod \"b781378d-290f-4d35-b317-058edd3ae9ea\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.677467 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4z7b\" (UniqueName: \"kubernetes.io/projected/b781378d-290f-4d35-b317-058edd3ae9ea-kube-api-access-x4z7b\") pod \"b781378d-290f-4d35-b317-058edd3ae9ea\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.677500 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-combined-ca-bundle\") pod \"b781378d-290f-4d35-b317-058edd3ae9ea\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.677524 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-scripts\") pod \"b781378d-290f-4d35-b317-058edd3ae9ea\" (UID: \"b781378d-290f-4d35-b317-058edd3ae9ea\") " Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.677935 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.682630 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b735fb9-9fbf-4d03-9b87-e6d57634813a-kube-api-access-qvddj" (OuterVolumeSpecName: "kube-api-access-qvddj") pod "5b735fb9-9fbf-4d03-9b87-e6d57634813a" (UID: "5b735fb9-9fbf-4d03-9b87-e6d57634813a"). InnerVolumeSpecName "kube-api-access-qvddj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.690998 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b781378d-290f-4d35-b317-058edd3ae9ea-kube-api-access-x4z7b" (OuterVolumeSpecName: "kube-api-access-x4z7b") pod "b781378d-290f-4d35-b317-058edd3ae9ea" (UID: "b781378d-290f-4d35-b317-058edd3ae9ea"). InnerVolumeSpecName "kube-api-access-x4z7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.692598 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b781378d-290f-4d35-b317-058edd3ae9ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b781378d-290f-4d35-b317-058edd3ae9ea" (UID: "b781378d-290f-4d35-b317-058edd3ae9ea"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.692808 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.701272 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b781378d-290f-4d35-b317-058edd3ae9ea" (UID: "b781378d-290f-4d35-b317-058edd3ae9ea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.721079 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-scripts" (OuterVolumeSpecName: "scripts") pod "b781378d-290f-4d35-b317-058edd3ae9ea" (UID: "b781378d-290f-4d35-b317-058edd3ae9ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.760964 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.781387 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4z7b\" (UniqueName: \"kubernetes.io/projected/b781378d-290f-4d35-b317-058edd3ae9ea-kube-api-access-x4z7b\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.781664 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.781737 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.781791 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b781378d-290f-4d35-b317-058edd3ae9ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.781843 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvddj\" (UniqueName: \"kubernetes.io/projected/5b735fb9-9fbf-4d03-9b87-e6d57634813a-kube-api-access-qvddj\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 
09:34:29.818533 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b781378d-290f-4d35-b317-058edd3ae9ea" (UID: "b781378d-290f-4d35-b317-058edd3ae9ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.823598 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b735fb9-9fbf-4d03-9b87-e6d57634813a" (UID: "5b735fb9-9fbf-4d03-9b87-e6d57634813a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.823690 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-config" (OuterVolumeSpecName: "config") pod "5b735fb9-9fbf-4d03-9b87-e6d57634813a" (UID: "5b735fb9-9fbf-4d03-9b87-e6d57634813a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.866377 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5b735fb9-9fbf-4d03-9b87-e6d57634813a" (UID: "5b735fb9-9fbf-4d03-9b87-e6d57634813a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.885401 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.885431 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.885448 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.885479 4763 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b735fb9-9fbf-4d03-9b87-e6d57634813a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.962579 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data" (OuterVolumeSpecName: "config-data") pod "b781378d-290f-4d35-b317-058edd3ae9ea" (UID: "b781378d-290f-4d35-b317-058edd3ae9ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:29 crc kubenswrapper[4763]: I1201 09:34:29.990486 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b781378d-290f-4d35-b317-058edd3ae9ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.087144 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.102422 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b781378d-290f-4d35-b317-058edd3ae9ea","Type":"ContainerDied","Data":"6bedf941dd155076cbcf29c4ca5fa917e1d32894be027c4b6db3f98bd927e190"} Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.102833 4763 scope.go:117] "RemoveContainer" containerID="4835c657b96bc3d4e46c79b97f63c9584b01cb5bd596af4f0b7540a54635bd6f" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.102529 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: W1201 09:34:30.103827 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb22eceb2_ee23_4a6c_a993_bb280fe2d41f.slice/crio-386c533274ddeb0975eb716b4fcc136013097a2af9c1e0608a3c72fc4f649760 WatchSource:0}: Error finding container 386c533274ddeb0975eb716b4fcc136013097a2af9c1e0608a3c72fc4f649760: Status 404 returned error can't find the container with id 386c533274ddeb0975eb716b4fcc136013097a2af9c1e0608a3c72fc4f649760 Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.110820 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.112585 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d79d596fb-f8qcf" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.114529 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d79d596fb-f8qcf" event={"ID":"5b735fb9-9fbf-4d03-9b87-e6d57634813a","Type":"ContainerDied","Data":"c5333fbb295544ff584f064f8b4b80e7421cc48beed1a35467edd56b22f1e52b"} Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.172638 4763 scope.go:117] "RemoveContainer" containerID="180b962ffad1df1a6a8129211d77e901e7e48675be5d1af2ba1c8a662bde876e" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.295233 4763 scope.go:117] "RemoveContainer" containerID="835d0dbca7e4a633961e6837fbc3d659420f7e5fb545c50f64b71d9f99650e2b" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.311819 4763 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1e160697-4938-43b5-b4e3-97d8199c8f03" podUID="b22eceb2-ee23-4a6c-a993-bb280fe2d41f" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.322653 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d79d596fb-f8qcf"] Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.344796 4763 scope.go:117] "RemoveContainer" containerID="1e992a86e6515341c22c2de89a92b32a7629cb525b534d1e4af94a02200cf768" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.361500 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d79d596fb-f8qcf"] Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.382441 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.398538 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.416527 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:30 crc kubenswrapper[4763]: E1201 09:34:30.416968 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerName="neutron-api" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.416995 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerName="neutron-api" Dec 01 09:34:30 crc kubenswrapper[4763]: E1201 09:34:30.417012 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerName="neutron-httpd" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.417021 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerName="neutron-httpd" Dec 01 09:34:30 crc kubenswrapper[4763]: E1201 09:34:30.417031 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b781378d-290f-4d35-b317-058edd3ae9ea" containerName="probe" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.417038 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b781378d-290f-4d35-b317-058edd3ae9ea" containerName="probe" Dec 01 09:34:30 crc kubenswrapper[4763]: E1201 09:34:30.417053 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b781378d-290f-4d35-b317-058edd3ae9ea" containerName="cinder-scheduler" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.417062 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b781378d-290f-4d35-b317-058edd3ae9ea" containerName="cinder-scheduler" Dec 01 09:34:30 crc 
kubenswrapper[4763]: I1201 09:34:30.417248 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerName="neutron-api" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.417265 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b781378d-290f-4d35-b317-058edd3ae9ea" containerName="probe" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.417274 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" containerName="neutron-httpd" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.417284 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b781378d-290f-4d35-b317-058edd3ae9ea" containerName="cinder-scheduler" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.418208 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.431255 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.471967 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.499570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.499615 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.499647 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.499772 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7475a3a-70b9-44d0-94b2-3c3890185f85-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.499852 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.500089 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxkw\" (UniqueName: \"kubernetes.io/projected/e7475a3a-70b9-44d0-94b2-3c3890185f85-kube-api-access-zcxkw\") pod \"cinder-scheduler-0\" (UID: 
\"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.601997 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.602081 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7475a3a-70b9-44d0-94b2-3c3890185f85-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.602128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.602255 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxkw\" (UniqueName: \"kubernetes.io/projected/e7475a3a-70b9-44d0-94b2-3c3890185f85-kube-api-access-zcxkw\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.602301 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.602330 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.602818 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7475a3a-70b9-44d0-94b2-3c3890185f85-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.608638 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.609117 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.611167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-scripts\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.623232 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxkw\" (UniqueName: \"kubernetes.io/projected/e7475a3a-70b9-44d0-94b2-3c3890185f85-kube-api-access-zcxkw\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.629228 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7475a3a-70b9-44d0-94b2-3c3890185f85-config-data\") pod \"cinder-scheduler-0\" (UID: \"e7475a3a-70b9-44d0-94b2-3c3890185f85\") " pod="openstack/cinder-scheduler-0" Dec 01 09:34:30 crc kubenswrapper[4763]: I1201 09:34:30.861902 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 01 09:34:31 crc kubenswrapper[4763]: I1201 09:34:31.012620 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e160697-4938-43b5-b4e3-97d8199c8f03" path="/var/lib/kubelet/pods/1e160697-4938-43b5-b4e3-97d8199c8f03/volumes" Dec 01 09:34:31 crc kubenswrapper[4763]: I1201 09:34:31.013324 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b735fb9-9fbf-4d03-9b87-e6d57634813a" path="/var/lib/kubelet/pods/5b735fb9-9fbf-4d03-9b87-e6d57634813a/volumes" Dec 01 09:34:31 crc kubenswrapper[4763]: I1201 09:34:31.014138 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b781378d-290f-4d35-b317-058edd3ae9ea" path="/var/lib/kubelet/pods/b781378d-290f-4d35-b317-058edd3ae9ea/volumes" Dec 01 09:34:31 crc kubenswrapper[4763]: I1201 09:34:31.142175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b22eceb2-ee23-4a6c-a993-bb280fe2d41f","Type":"ContainerStarted","Data":"386c533274ddeb0975eb716b4fcc136013097a2af9c1e0608a3c72fc4f649760"} Dec 01 09:34:31 crc kubenswrapper[4763]: I1201 09:34:31.492444 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 01 09:34:31 crc kubenswrapper[4763]: I1201 09:34:31.975029 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:32 crc kubenswrapper[4763]: I1201 09:34:32.151260 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d56769fd6-btjcl" Dec 01 09:34:32 crc kubenswrapper[4763]: I1201 09:34:32.156415 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7475a3a-70b9-44d0-94b2-3c3890185f85","Type":"ContainerStarted","Data":"e6dfd9e8983ed65daa8dc7ff44a1238ec1dc41834711b123c9b78d8537098376"} Dec 01 09:34:32 crc kubenswrapper[4763]: I1201 09:34:32.222678 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-798f56bcd6-5xxds"] Dec 01 09:34:32 crc kubenswrapper[4763]: I1201 09:34:32.223265 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" containerID="cri-o://8a7f3305afcda3d190dc2a73b96d745bd8c6ca19d45c076bfd46d6f14de55911" gracePeriod=30 Dec 01 09:34:32 crc kubenswrapper[4763]: I1201 09:34:32.223495 
4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api" containerID="cri-o://eb45742bfd1f12811b5bffd38a92fc7bbd55ee195e72e69b8300303b10cdd6bb" gracePeriod=30 Dec 01 09:34:33 crc kubenswrapper[4763]: I1201 09:34:33.173362 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": EOF" Dec 01 09:34:33 crc kubenswrapper[4763]: I1201 09:34:33.173763 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": EOF" Dec 01 09:34:33 crc kubenswrapper[4763]: I1201 09:34:33.929893 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:34:33 crc kubenswrapper[4763]: I1201 09:34:33.930305 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:34:34 crc kubenswrapper[4763]: I1201 09:34:34.174603 4763 generic.go:334] "Generic (PLEG): container finished" podID="b94d6350-7336-4142-8909-bb3c5e09412f" containerID="8a7f3305afcda3d190dc2a73b96d745bd8c6ca19d45c076bfd46d6f14de55911" exitCode=143 Dec 01 09:34:34 crc kubenswrapper[4763]: I1201 09:34:34.174656 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-798f56bcd6-5xxds" event={"ID":"b94d6350-7336-4142-8909-bb3c5e09412f","Type":"ContainerDied","Data":"8a7f3305afcda3d190dc2a73b96d745bd8c6ca19d45c076bfd46d6f14de55911"} Dec 01 09:34:36 crc kubenswrapper[4763]: I1201 09:34:35.187313 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7475a3a-70b9-44d0-94b2-3c3890185f85","Type":"ContainerStarted","Data":"e013b56937cc12a9f146539170eb61293fb224a25f43573519df711ab5956461"} Dec 01 09:34:36 crc kubenswrapper[4763]: I1201 09:34:35.939687 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-cg9jb" podUID="9a10b30c-69e2-4037-bc91-dfd5191a6e72" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:36 crc kubenswrapper[4763]: I1201 09:34:36.642768 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:54306->10.217.0.146:9311: read: connection reset by peer" Dec 01 09:34:36 crc kubenswrapper[4763]: I1201 09:34:36.643728 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" 
containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:54318->10.217.0.146:9311: read: connection reset by peer" Dec 01 09:34:37 crc kubenswrapper[4763]: I1201 09:34:37.203873 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="054a3443-a215-4987-993e-ea2d282c1d0d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.152:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:37 crc kubenswrapper[4763]: I1201 09:34:37.218660 4763 generic.go:334] "Generic (PLEG): container finished" podID="b94d6350-7336-4142-8909-bb3c5e09412f" containerID="eb45742bfd1f12811b5bffd38a92fc7bbd55ee195e72e69b8300303b10cdd6bb" exitCode=0 Dec 01 09:34:37 crc kubenswrapper[4763]: I1201 09:34:37.218734 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-798f56bcd6-5xxds" event={"ID":"b94d6350-7336-4142-8909-bb3c5e09412f","Type":"ContainerDied","Data":"eb45742bfd1f12811b5bffd38a92fc7bbd55ee195e72e69b8300303b10cdd6bb"} Dec 01 09:34:37 crc kubenswrapper[4763]: I1201 09:34:37.220269 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e7475a3a-70b9-44d0-94b2-3c3890185f85","Type":"ContainerStarted","Data":"c5f114484ff47726ff0e2c667ccdde8fad6d52a0d1c5fff9e01072421527620d"} Dec 01 09:34:37 crc kubenswrapper[4763]: I1201 09:34:37.245167 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.245150504 podStartE2EDuration="7.245150504s" podCreationTimestamp="2025-12-01 09:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:37.242321164 +0000 UTC m=+1194.510969922" watchObservedRunningTime="2025-12-01 09:34:37.245150504 +0000 UTC m=+1194.513799272" Dec 01 09:34:38 crc kubenswrapper[4763]: I1201 09:34:38.197729 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="054a3443-a215-4987-993e-ea2d282c1d0d" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.152:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:40 crc kubenswrapper[4763]: I1201 09:34:40.645604 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 01 09:34:40 crc kubenswrapper[4763]: I1201 09:34:40.863787 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 01 09:34:41 crc kubenswrapper[4763]: I1201 09:34:41.147274 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 01 09:34:45 crc kubenswrapper[4763]: I1201 09:34:45.447044 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:45 crc kubenswrapper[4763]: I1201 09:34:45.447931 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="ceilometer-central-agent" containerID="cri-o://3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08" gracePeriod=30 Dec 01 09:34:45 crc kubenswrapper[4763]: I1201 09:34:45.448588 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="proxy-httpd" containerID="cri-o://346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52" gracePeriod=30 Dec 01 09:34:45 crc kubenswrapper[4763]: I1201 09:34:45.448596 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="ceilometer-notification-agent" containerID="cri-o://6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76" gracePeriod=30 Dec 01 09:34:45 crc kubenswrapper[4763]: I1201 09:34:45.448673 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="sg-core" containerID="cri-o://dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c" gracePeriod=30 Dec 01 09:34:45 crc kubenswrapper[4763]: I1201 09:34:45.462503 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.149:3000/\": EOF" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.002972 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.149:3000/\": dial tcp 10.217.0.149:3000: connect: connection refused" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.013743 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": dial tcp 10.217.0.146:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.013763 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.319063 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-798f56bcd6-5xxds" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.350045 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-798f56bcd6-5xxds" event={"ID":"b94d6350-7336-4142-8909-bb3c5e09412f","Type":"ContainerDied","Data":"9bea4edd035f74cca4a29915c45f1bc4ba4ec12c293ba206750f768695194013"} Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.350128 4763 scope.go:117] "RemoveContainer" containerID="eb45742bfd1f12811b5bffd38a92fc7bbd55ee195e72e69b8300303b10cdd6bb" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.350264 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-798f56bcd6-5xxds" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.365423 4763 generic.go:334] "Generic (PLEG): container finished" podID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerID="346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52" exitCode=0 Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.365543 4763 generic.go:334] "Generic (PLEG): container finished" podID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerID="dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c" exitCode=2 Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.365558 4763 generic.go:334] "Generic (PLEG): container finished" podID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerID="3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08" exitCode=0 Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.365583 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerDied","Data":"346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52"} Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.365615 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerDied","Data":"dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c"} Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.365628 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerDied","Data":"3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08"} Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.405602 4763 scope.go:117] "RemoveContainer" containerID="8a7f3305afcda3d190dc2a73b96d745bd8c6ca19d45c076bfd46d6f14de55911" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.444700 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data\") pod \"b94d6350-7336-4142-8909-bb3c5e09412f\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.444867 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b94d6350-7336-4142-8909-bb3c5e09412f-logs\") pod \"b94d6350-7336-4142-8909-bb3c5e09412f\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.444913 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lpmt\" (UniqueName: \"kubernetes.io/projected/b94d6350-7336-4142-8909-bb3c5e09412f-kube-api-access-9lpmt\") pod \"b94d6350-7336-4142-8909-bb3c5e09412f\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.444937 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-combined-ca-bundle\") pod \"b94d6350-7336-4142-8909-bb3c5e09412f\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.445027 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data-custom\") pod \"b94d6350-7336-4142-8909-bb3c5e09412f\" (UID: \"b94d6350-7336-4142-8909-bb3c5e09412f\") " Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.445750 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94d6350-7336-4142-8909-bb3c5e09412f-logs" (OuterVolumeSpecName: "logs") pod "b94d6350-7336-4142-8909-bb3c5e09412f" (UID: "b94d6350-7336-4142-8909-bb3c5e09412f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.452413 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b94d6350-7336-4142-8909-bb3c5e09412f" (UID: "b94d6350-7336-4142-8909-bb3c5e09412f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.453896 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b94d6350-7336-4142-8909-bb3c5e09412f-kube-api-access-9lpmt" (OuterVolumeSpecName: "kube-api-access-9lpmt") pod "b94d6350-7336-4142-8909-bb3c5e09412f" (UID: "b94d6350-7336-4142-8909-bb3c5e09412f"). InnerVolumeSpecName "kube-api-access-9lpmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.473847 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b94d6350-7336-4142-8909-bb3c5e09412f" (UID: "b94d6350-7336-4142-8909-bb3c5e09412f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.498857 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data" (OuterVolumeSpecName: "config-data") pod "b94d6350-7336-4142-8909-bb3c5e09412f" (UID: "b94d6350-7336-4142-8909-bb3c5e09412f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.546644 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b94d6350-7336-4142-8909-bb3c5e09412f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.546679 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lpmt\" (UniqueName: \"kubernetes.io/projected/b94d6350-7336-4142-8909-bb3c5e09412f-kube-api-access-9lpmt\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.546693 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.546702 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.546711 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b94d6350-7336-4142-8909-bb3c5e09412f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.684695 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-798f56bcd6-5xxds"] Dec 01 09:34:46 crc kubenswrapper[4763]: I1201 09:34:46.693122 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-798f56bcd6-5xxds"] Dec 01 09:34:47 crc kubenswrapper[4763]: I1201 09:34:47.005537 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" path="/var/lib/kubelet/pods/b94d6350-7336-4142-8909-bb3c5e09412f/volumes" Dec 01 09:34:47 crc kubenswrapper[4763]: I1201 09:34:47.376311 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b22eceb2-ee23-4a6c-a993-bb280fe2d41f","Type":"ContainerStarted","Data":"215c545883f65f3ef143dec5958063174ee1d86056e8f8bff0bdceb59dd6b9f4"} Dec 01 09:34:47 crc kubenswrapper[4763]: I1201 09:34:47.402102 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.364968711 podStartE2EDuration="19.402084016s" podCreationTimestamp="2025-12-01 09:34:28 +0000 UTC" firstStartedPulling="2025-12-01 09:34:30.109504633 +0000 UTC m=+1187.378153391" lastFinishedPulling="2025-12-01 09:34:46.146619928 +0000 UTC m=+1203.415268696" observedRunningTime="2025-12-01 09:34:47.395501251 +0000 UTC m=+1204.664150019" watchObservedRunningTime="2025-12-01 09:34:47.402084016 +0000 UTC m=+1204.670732784" Dec 01 09:34:47 crc kubenswrapper[4763]: I1201 09:34:47.987878 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.178364 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-combined-ca-bundle\") pod \"523216bb-3506-4d7e-a7f7-7a50af4d838e\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.178405 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-config-data\") pod \"523216bb-3506-4d7e-a7f7-7a50af4d838e\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.178441 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-scripts\") pod \"523216bb-3506-4d7e-a7f7-7a50af4d838e\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.178480 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnrch\" (UniqueName: \"kubernetes.io/projected/523216bb-3506-4d7e-a7f7-7a50af4d838e-kube-api-access-tnrch\") pod \"523216bb-3506-4d7e-a7f7-7a50af4d838e\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.178530 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-log-httpd\") pod \"523216bb-3506-4d7e-a7f7-7a50af4d838e\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.178548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-run-httpd\") pod \"523216bb-3506-4d7e-a7f7-7a50af4d838e\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.178563 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-sg-core-conf-yaml\") pod \"523216bb-3506-4d7e-a7f7-7a50af4d838e\" (UID: \"523216bb-3506-4d7e-a7f7-7a50af4d838e\") " Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.179187 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "523216bb-3506-4d7e-a7f7-7a50af4d838e" (UID: "523216bb-3506-4d7e-a7f7-7a50af4d838e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.179491 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "523216bb-3506-4d7e-a7f7-7a50af4d838e" (UID: "523216bb-3506-4d7e-a7f7-7a50af4d838e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.199762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523216bb-3506-4d7e-a7f7-7a50af4d838e-kube-api-access-tnrch" (OuterVolumeSpecName: "kube-api-access-tnrch") pod "523216bb-3506-4d7e-a7f7-7a50af4d838e" (UID: "523216bb-3506-4d7e-a7f7-7a50af4d838e"). InnerVolumeSpecName "kube-api-access-tnrch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.204947 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-scripts" (OuterVolumeSpecName: "scripts") pod "523216bb-3506-4d7e-a7f7-7a50af4d838e" (UID: "523216bb-3506-4d7e-a7f7-7a50af4d838e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.216639 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "523216bb-3506-4d7e-a7f7-7a50af4d838e" (UID: "523216bb-3506-4d7e-a7f7-7a50af4d838e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.276599 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "523216bb-3506-4d7e-a7f7-7a50af4d838e" (UID: "523216bb-3506-4d7e-a7f7-7a50af4d838e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.280881 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.281080 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.281323 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnrch\" (UniqueName: \"kubernetes.io/projected/523216bb-3506-4d7e-a7f7-7a50af4d838e-kube-api-access-tnrch\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.281356 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.281370 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/523216bb-3506-4d7e-a7f7-7a50af4d838e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.281382 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.286786 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-config-data" (OuterVolumeSpecName: "config-data") pod "523216bb-3506-4d7e-a7f7-7a50af4d838e" (UID: "523216bb-3506-4d7e-a7f7-7a50af4d838e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.383428 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/523216bb-3506-4d7e-a7f7-7a50af4d838e-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.389298 4763 generic.go:334] "Generic (PLEG): container finished" podID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerID="6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76" exitCode=0 Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.389366 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.389428 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerDied","Data":"6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76"} Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.389474 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"523216bb-3506-4d7e-a7f7-7a50af4d838e","Type":"ContainerDied","Data":"f7ac24bfe146cc97258c4abd08a0bd099badc221a0a7b0f1e857cfc5cc5c55c9"} Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.389496 4763 scope.go:117] "RemoveContainer" containerID="346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.414737 4763 scope.go:117] "RemoveContainer" containerID="dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.421926 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.439631 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.442774 4763 scope.go:117] "RemoveContainer" containerID="6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.453696 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.454129 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.454223 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.454292 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="ceilometer-central-agent" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.454348 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="ceilometer-central-agent" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.454414 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" 
containerName="proxy-httpd" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.454486 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="proxy-httpd" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.454593 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="sg-core" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.454668 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="sg-core" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.454755 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="ceilometer-notification-agent" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.456228 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="ceilometer-notification-agent" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.456313 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.456429 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.462030 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="sg-core" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.467067 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="ceilometer-notification-agent" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.467279 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="ceilometer-central-agent" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.467363 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.467714 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" containerName="proxy-httpd" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.470855 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.474117 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.474238 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.493075 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.493290 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.495456 4763 scope.go:117] "RemoveContainer" containerID="3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.550433 4763 scope.go:117] "RemoveContainer" containerID="346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.550893 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52\": container with ID starting with 346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52 not found: ID does not exist" containerID="346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.550927 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52"} err="failed to get container status \"346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52\": rpc error: code = NotFound desc = could not find container \"346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52\": container with ID starting with 346d0331bd7f6e6d2f89b46280d7d4a937fcf6314ed3ad3cc526e2e8df421a52 not found: ID does not exist" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.550950 4763 scope.go:117] "RemoveContainer" containerID="dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.551149 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c\": container with ID starting with dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c not found: ID does not exist" containerID="dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.551185 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c"} err="failed to get container status \"dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c\": rpc error: code = NotFound desc = could not find container \"dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c\": container with ID starting with dd254f442d354a5a5e4cf9663a456f5524d2e4c095feb3dff9295c8ab863c94c not found: ID does not exist" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.551202 4763 scope.go:117] "RemoveContainer" containerID="6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.551667 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76\": container with ID starting with 
6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76 not found: ID does not exist" containerID="6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.551695 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76"} err="failed to get container status \"6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76\": rpc error: code = NotFound desc = could not find container \"6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76\": container with ID starting with 6419391164d31e67772c4abe570f9d349637ce304611278a0fdab6b95ac31f76 not found: ID does not exist" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.551711 4763 scope.go:117] "RemoveContainer" containerID="3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08" Dec 01 09:34:48 crc kubenswrapper[4763]: E1201 09:34:48.551956 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08\": container with ID starting with 3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08 not found: ID does not exist" containerID="3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.551992 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08"} err="failed to get container status \"3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08\": rpc error: code = NotFound desc = could not find container \"3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08\": container with ID starting with 3a7a0e2a389297beb7a6c87de88b8d2ea297fcbd2ebfd0411fe08770496f2e08 not found: ID does not exist" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.595760 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lpcc\" (UniqueName: \"kubernetes.io/projected/df4525a5-2599-48ed-aef4-c784f59d2f5b-kube-api-access-4lpcc\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.595833 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-config-data\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.595864 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-log-httpd\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.595900 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-run-httpd\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 
09:34:48.596073 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.596116 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.596134 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-scripts\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.697595 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.698382 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.698937 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-scripts\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.698998 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lpcc\" (UniqueName: \"kubernetes.io/projected/df4525a5-2599-48ed-aef4-c784f59d2f5b-kube-api-access-4lpcc\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.699042 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-config-data\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.699067 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-log-httpd\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.699089 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-run-httpd\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 
09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.699720 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-run-httpd\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.700650 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-log-httpd\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.702368 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-scripts\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.702919 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.703704 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.707240 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-config-data\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.718954 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lpcc\" (UniqueName: \"kubernetes.io/projected/df4525a5-2599-48ed-aef4-c784f59d2f5b-kube-api-access-4lpcc\") pod \"ceilometer-0\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " pod="openstack/ceilometer-0" Dec 01 09:34:48 crc kubenswrapper[4763]: I1201 09:34:48.820308 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:49 crc kubenswrapper[4763]: I1201 09:34:49.018003 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523216bb-3506-4d7e-a7f7-7a50af4d838e" path="/var/lib/kubelet/pods/523216bb-3506-4d7e-a7f7-7a50af4d838e/volumes" Dec 01 09:34:49 crc kubenswrapper[4763]: I1201 09:34:49.286073 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:49 crc kubenswrapper[4763]: I1201 09:34:49.400734 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerStarted","Data":"1a9f7b723a11a212800a66546457851abc453c9f9bec8b32eadfb52f7a16d2d9"} Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.442565 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fmhwj"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.444239 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fmhwj"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.444319 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.549762 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445a892a-304f-4d8b-9163-78f5a1011a53-operator-scripts\") pod \"nova-api-db-create-fmhwj\" (UID: \"445a892a-304f-4d8b-9163-78f5a1011a53\") " pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.550067 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz9g4\" (UniqueName: \"kubernetes.io/projected/445a892a-304f-4d8b-9163-78f5a1011a53-kube-api-access-gz9g4\") pod \"nova-api-db-create-fmhwj\" (UID: \"445a892a-304f-4d8b-9163-78f5a1011a53\") " pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.597679 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vrlcm"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.598755 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.611550 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vrlcm"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.628525 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fd25-account-create-update-2qvtq"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.629618 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.633984 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.652363 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445a892a-304f-4d8b-9163-78f5a1011a53-operator-scripts\") pod \"nova-api-db-create-fmhwj\" (UID: \"445a892a-304f-4d8b-9163-78f5a1011a53\") " pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.652457 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz9g4\" (UniqueName: \"kubernetes.io/projected/445a892a-304f-4d8b-9163-78f5a1011a53-kube-api-access-gz9g4\") pod \"nova-api-db-create-fmhwj\" (UID: \"445a892a-304f-4d8b-9163-78f5a1011a53\") " pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.653601 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445a892a-304f-4d8b-9163-78f5a1011a53-operator-scripts\") pod \"nova-api-db-create-fmhwj\" (UID: \"445a892a-304f-4d8b-9163-78f5a1011a53\") " pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.657206 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fd25-account-create-update-2qvtq"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.711135 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz9g4\" (UniqueName: \"kubernetes.io/projected/445a892a-304f-4d8b-9163-78f5a1011a53-kube-api-access-gz9g4\") pod \"nova-api-db-create-fmhwj\" (UID: \"445a892a-304f-4d8b-9163-78f5a1011a53\") " pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.736630 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-sbf6n"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.737918 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.747110 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sbf6n"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.753791 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93979ddc-1fca-4d13-bad7-7123ec597957-operator-scripts\") pod \"nova-api-fd25-account-create-update-2qvtq\" (UID: \"93979ddc-1fca-4d13-bad7-7123ec597957\") " pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.753871 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ff2701-69e8-4c30-a9cd-76a9862849e3-operator-scripts\") pod \"nova-cell0-db-create-vrlcm\" (UID: \"83ff2701-69e8-4c30-a9cd-76a9862849e3\") " pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.754047 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jr2\" (UniqueName: \"kubernetes.io/projected/93979ddc-1fca-4d13-bad7-7123ec597957-kube-api-access-92jr2\") pod \"nova-api-fd25-account-create-update-2qvtq\" (UID: \"93979ddc-1fca-4d13-bad7-7123ec597957\") " pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.754084 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddppq\" (UniqueName: \"kubernetes.io/projected/83ff2701-69e8-4c30-a9cd-76a9862849e3-kube-api-access-ddppq\") pod \"nova-cell0-db-create-vrlcm\" (UID: \"83ff2701-69e8-4c30-a9cd-76a9862849e3\") " pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.800010 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6992-account-create-update-g66rw"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.801135 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.806126 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.817040 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6992-account-create-update-g66rw"] Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.819300 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.867544 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfsq6\" (UniqueName: \"kubernetes.io/projected/b45c06ea-1ae1-4524-a3a5-704562d6aaab-kube-api-access-tfsq6\") pod \"nova-cell1-db-create-sbf6n\" (UID: \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\") " pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.867603 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jr2\" (UniqueName: \"kubernetes.io/projected/93979ddc-1fca-4d13-bad7-7123ec597957-kube-api-access-92jr2\") pod \"nova-api-fd25-account-create-update-2qvtq\" (UID: \"93979ddc-1fca-4d13-bad7-7123ec597957\") " pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.867630 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddppq\" (UniqueName: \"kubernetes.io/projected/83ff2701-69e8-4c30-a9cd-76a9862849e3-kube-api-access-ddppq\") pod \"nova-cell0-db-create-vrlcm\" (UID: \"83ff2701-69e8-4c30-a9cd-76a9862849e3\") " pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.867654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93979ddc-1fca-4d13-bad7-7123ec597957-operator-scripts\") pod \"nova-api-fd25-account-create-update-2qvtq\" (UID: \"93979ddc-1fca-4d13-bad7-7123ec597957\") " pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.867685 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b45c06ea-1ae1-4524-a3a5-704562d6aaab-operator-scripts\") pod \"nova-cell1-db-create-sbf6n\" (UID: \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\") " pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.867706 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ff2701-69e8-4c30-a9cd-76a9862849e3-operator-scripts\") pod \"nova-cell0-db-create-vrlcm\" (UID: \"83ff2701-69e8-4c30-a9cd-76a9862849e3\") " pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.869259 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93979ddc-1fca-4d13-bad7-7123ec597957-operator-scripts\") pod \"nova-api-fd25-account-create-update-2qvtq\" (UID: \"93979ddc-1fca-4d13-bad7-7123ec597957\") " pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.869794 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ff2701-69e8-4c30-a9cd-76a9862849e3-operator-scripts\") pod \"nova-cell0-db-create-vrlcm\" (UID: \"83ff2701-69e8-4c30-a9cd-76a9862849e3\") " pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.895876 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jr2\" (UniqueName: 
\"kubernetes.io/projected/93979ddc-1fca-4d13-bad7-7123ec597957-kube-api-access-92jr2\") pod \"nova-api-fd25-account-create-update-2qvtq\" (UID: \"93979ddc-1fca-4d13-bad7-7123ec597957\") " pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.899111 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddppq\" (UniqueName: \"kubernetes.io/projected/83ff2701-69e8-4c30-a9cd-76a9862849e3-kube-api-access-ddppq\") pod \"nova-cell0-db-create-vrlcm\" (UID: \"83ff2701-69e8-4c30-a9cd-76a9862849e3\") " pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.926589 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.959414 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.970125 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-operator-scripts\") pod \"nova-cell0-6992-account-create-update-g66rw\" (UID: \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\") " pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.970179 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b45c06ea-1ae1-4524-a3a5-704562d6aaab-operator-scripts\") pod \"nova-cell1-db-create-sbf6n\" (UID: \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\") " pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.970281 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn4kb\" (UniqueName: \"kubernetes.io/projected/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-kube-api-access-jn4kb\") pod \"nova-cell0-6992-account-create-update-g66rw\" (UID: \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\") " pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.970421 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfsq6\" (UniqueName: \"kubernetes.io/projected/b45c06ea-1ae1-4524-a3a5-704562d6aaab-kube-api-access-tfsq6\") pod \"nova-cell1-db-create-sbf6n\" (UID: \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\") " pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:50 crc kubenswrapper[4763]: I1201 09:34:50.971739 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b45c06ea-1ae1-4524-a3a5-704562d6aaab-operator-scripts\") pod \"nova-cell1-db-create-sbf6n\" (UID: \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\") " pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.015306 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfsq6\" (UniqueName: \"kubernetes.io/projected/b45c06ea-1ae1-4524-a3a5-704562d6aaab-kube-api-access-tfsq6\") pod \"nova-cell1-db-create-sbf6n\" (UID: \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\") " pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.016958 4763 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.017758 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-798f56bcd6-5xxds" podUID="b94d6350-7336-4142-8909-bb3c5e09412f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.033840 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3d32-account-create-update-shdxp"] Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.035102 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.038771 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.074764 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-operator-scripts\") pod \"nova-cell0-6992-account-create-update-g66rw\" (UID: \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\") " pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.074960 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn4kb\" (UniqueName: \"kubernetes.io/projected/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-kube-api-access-jn4kb\") pod \"nova-cell0-6992-account-create-update-g66rw\" (UID: \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\") " pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.106079 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.108402 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-operator-scripts\") pod \"nova-cell0-6992-account-create-update-g66rw\" (UID: \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\") " pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.130340 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3d32-account-create-update-shdxp"] Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.148530 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn4kb\" (UniqueName: \"kubernetes.io/projected/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-kube-api-access-jn4kb\") pod \"nova-cell0-6992-account-create-update-g66rw\" (UID: \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\") " pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.202731 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70433ec-65b9-46db-8068-4283eb38245a-operator-scripts\") pod \"nova-cell1-3d32-account-create-update-shdxp\" (UID: \"a70433ec-65b9-46db-8068-4283eb38245a\") " pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.203014 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqvb\" (UniqueName: \"kubernetes.io/projected/a70433ec-65b9-46db-8068-4283eb38245a-kube-api-access-4jqvb\") pod \"nova-cell1-3d32-account-create-update-shdxp\" (UID: \"a70433ec-65b9-46db-8068-4283eb38245a\") " pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.307320 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jqvb\" (UniqueName: \"kubernetes.io/projected/a70433ec-65b9-46db-8068-4283eb38245a-kube-api-access-4jqvb\") pod \"nova-cell1-3d32-account-create-update-shdxp\" (UID: \"a70433ec-65b9-46db-8068-4283eb38245a\") " pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.307401 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70433ec-65b9-46db-8068-4283eb38245a-operator-scripts\") pod \"nova-cell1-3d32-account-create-update-shdxp\" (UID: \"a70433ec-65b9-46db-8068-4283eb38245a\") " pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.308204 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70433ec-65b9-46db-8068-4283eb38245a-operator-scripts\") pod \"nova-cell1-3d32-account-create-update-shdxp\" (UID: \"a70433ec-65b9-46db-8068-4283eb38245a\") " pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.395134 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jqvb\" (UniqueName: \"kubernetes.io/projected/a70433ec-65b9-46db-8068-4283eb38245a-kube-api-access-4jqvb\") pod 
\"nova-cell1-3d32-account-create-update-shdxp\" (UID: \"a70433ec-65b9-46db-8068-4283eb38245a\") " pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.399178 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.429066 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.448868 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerStarted","Data":"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c"} Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.699338 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vrlcm"] Dec 01 09:34:51 crc kubenswrapper[4763]: I1201 09:34:51.742957 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fmhwj"] Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.080787 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fd25-account-create-update-2qvtq"] Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.119348 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-sbf6n"] Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.202146 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3d32-account-create-update-shdxp"] Dec 01 09:34:52 crc kubenswrapper[4763]: W1201 09:34:52.215658 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda70433ec_65b9_46db_8068_4283eb38245a.slice/crio-ede5cc06c706dd22881eb6c89f6bef826c5326e163bae84d1377e2713b1c2dd4 WatchSource:0}: Error finding container ede5cc06c706dd22881eb6c89f6bef826c5326e163bae84d1377e2713b1c2dd4: Status 404 returned error can't find the container with id ede5cc06c706dd22881eb6c89f6bef826c5326e163bae84d1377e2713b1c2dd4 Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.467755 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fmhwj" event={"ID":"445a892a-304f-4d8b-9163-78f5a1011a53","Type":"ContainerStarted","Data":"adf6a0703a06a8706c6f697716a2a1af6932a0bde37a76269af46c5a94b63797"} Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.468056 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fmhwj" event={"ID":"445a892a-304f-4d8b-9163-78f5a1011a53","Type":"ContainerStarted","Data":"90c01a0c6fb071e5f7b8624e22fba8c784e9f973e2136560308cc2b41099f7b9"} Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.474013 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3d32-account-create-update-shdxp" event={"ID":"a70433ec-65b9-46db-8068-4283eb38245a","Type":"ContainerStarted","Data":"ede5cc06c706dd22881eb6c89f6bef826c5326e163bae84d1377e2713b1c2dd4"} Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.488445 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sbf6n" event={"ID":"b45c06ea-1ae1-4524-a3a5-704562d6aaab","Type":"ContainerStarted","Data":"ee5658cc4ddeaa6e6935e23004181ec958e3ae24d88ca67eb4bd56c124ab0440"} Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 
09:34:52.516241 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerStarted","Data":"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d"} Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.525605 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vrlcm" event={"ID":"83ff2701-69e8-4c30-a9cd-76a9862849e3","Type":"ContainerStarted","Data":"8eb18e37b6d89aa6c37b819e96bfcb945a3bfbf5d3ea26522115ef0f65f1e8a3"} Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.525663 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vrlcm" event={"ID":"83ff2701-69e8-4c30-a9cd-76a9862849e3","Type":"ContainerStarted","Data":"a63611993dbebb1318fcd2352f0706cbdee4ec2e005bc1ae2088d9c98b57e6e0"} Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.527029 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-fmhwj" podStartSLOduration=2.527002621 podStartE2EDuration="2.527002621s" podCreationTimestamp="2025-12-01 09:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:52.491645485 +0000 UTC m=+1209.760294263" watchObservedRunningTime="2025-12-01 09:34:52.527002621 +0000 UTC m=+1209.795651389" Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.534913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fd25-account-create-update-2qvtq" event={"ID":"93979ddc-1fca-4d13-bad7-7123ec597957","Type":"ContainerStarted","Data":"af59d828bb9fceccb0b9d226bd2bc1159c4a0af3fc06113f14bb3478a95447f3"} Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.564434 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6992-account-create-update-g66rw"] Dec 01 09:34:52 crc kubenswrapper[4763]: I1201 09:34:52.569890 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-vrlcm" podStartSLOduration=2.56987318 podStartE2EDuration="2.56987318s" podCreationTimestamp="2025-12-01 09:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:52.546376977 +0000 UTC m=+1209.815025735" watchObservedRunningTime="2025-12-01 09:34:52.56987318 +0000 UTC m=+1209.838521948" Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.554725 4763 generic.go:334] "Generic (PLEG): container finished" podID="445a892a-304f-4d8b-9163-78f5a1011a53" containerID="adf6a0703a06a8706c6f697716a2a1af6932a0bde37a76269af46c5a94b63797" exitCode=0 Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.554803 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fmhwj" event={"ID":"445a892a-304f-4d8b-9163-78f5a1011a53","Type":"ContainerDied","Data":"adf6a0703a06a8706c6f697716a2a1af6932a0bde37a76269af46c5a94b63797"} Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.556936 4763 generic.go:334] "Generic (PLEG): container finished" podID="a70433ec-65b9-46db-8068-4283eb38245a" containerID="060e281bc4aca80a2cc3bf1915e50c13eaaf69d34ba12237d8d225843aadee2a" exitCode=0
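
The pod_startup_latency_tracker entries record podStartSLOduration, the gap between the pod's creation timestamp and kubelet observing it running; the zero-valued firstStartedPulling/lastFinishedPulling fields mean no image pull was needed, so these db-create pods started in about 2.5 seconds. A rough out-of-cluster equivalent, assuming kubeconfig access to the same cluster:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}

	pod, err := cs.CoreV1().Pods("openstack").Get(context.Background(),
		"nova-api-db-create-fmhwj", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}

	// Time from pod creation to each container reporting Running; roughly
	// what the latency tracker logs as podStartE2EDuration.
	for _, st := range pod.Status.ContainerStatuses {
		if st.State.Running != nil {
			fmt.Printf("%s started %v after pod creation\n",
				st.Name, st.State.Running.StartedAt.Sub(pod.CreationTimestamp.Time))
		}
	}
}
```

Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.556998 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3d32-account-create-update-shdxp" 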
event={"ID":"a70433ec-65b9-46db-8068-4283eb38245a","Type":"ContainerDied","Data":"060e281bc4aca80a2cc3bf1915e50c13eaaf69d34ba12237d8d225843aadee2a"} Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.560938 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerStarted","Data":"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8"} Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.562576 4763 generic.go:334] "Generic (PLEG): container finished" podID="83ff2701-69e8-4c30-a9cd-76a9862849e3" containerID="8eb18e37b6d89aa6c37b819e96bfcb945a3bfbf5d3ea26522115ef0f65f1e8a3" exitCode=0 Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.562634 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vrlcm" event={"ID":"83ff2701-69e8-4c30-a9cd-76a9862849e3","Type":"ContainerDied","Data":"8eb18e37b6d89aa6c37b819e96bfcb945a3bfbf5d3ea26522115ef0f65f1e8a3"} Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.564547 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6992-account-create-update-g66rw" event={"ID":"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca","Type":"ContainerStarted","Data":"8ffa6b2dce43b3a67ba79e7c7ee917d1ee811d6804bbf8fb3c9a7a9f6d842d74"} Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.564608 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6992-account-create-update-g66rw" event={"ID":"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca","Type":"ContainerStarted","Data":"ca472fe291997eec4978117ff537bab0aaf300f05858bb6833dac15bab32e227"} Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.569149 4763 generic.go:334] "Generic (PLEG): container finished" podID="b45c06ea-1ae1-4524-a3a5-704562d6aaab" containerID="2c84d8c48283604e169a910c1b0a8ab513166332aec0973068d293fbd60f89a0" exitCode=0 Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.569198 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sbf6n" event={"ID":"b45c06ea-1ae1-4524-a3a5-704562d6aaab","Type":"ContainerDied","Data":"2c84d8c48283604e169a910c1b0a8ab513166332aec0973068d293fbd60f89a0"} Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.573234 4763 generic.go:334] "Generic (PLEG): container finished" podID="93979ddc-1fca-4d13-bad7-7123ec597957" containerID="0d5e1d579e04c894d2d13ec348b2bc0009855e6f729e9c0b4c6b59fb87e3c77b" exitCode=0 Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.573274 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fd25-account-create-update-2qvtq" event={"ID":"93979ddc-1fca-4d13-bad7-7123ec597957","Type":"ContainerDied","Data":"0d5e1d579e04c894d2d13ec348b2bc0009855e6f729e9c0b4c6b59fb87e3c77b"} Dec 01 09:34:53 crc kubenswrapper[4763]: I1201 09:34:53.640663 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-6992-account-create-update-g66rw" podStartSLOduration=3.640646451 podStartE2EDuration="3.640646451s" podCreationTimestamp="2025-12-01 09:34:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:34:53.637334598 +0000 UTC m=+1210.905983366" watchObservedRunningTime="2025-12-01 09:34:53.640646451 +0000 UTC m=+1210.909295219" Dec 01 09:34:54 crc kubenswrapper[4763]: I1201 09:34:54.611978 4763 generic.go:334] "Generic (PLEG): container finished" 
podID="6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca" containerID="8ffa6b2dce43b3a67ba79e7c7ee917d1ee811d6804bbf8fb3c9a7a9f6d842d74" exitCode=0 Dec 01 09:34:54 crc kubenswrapper[4763]: I1201 09:34:54.612200 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6992-account-create-update-g66rw" event={"ID":"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca","Type":"ContainerDied","Data":"8ffa6b2dce43b3a67ba79e7c7ee917d1ee811d6804bbf8fb3c9a7a9f6d842d74"} Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.231703 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.259075 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.269846 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.277045 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.331962 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376555 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jqvb\" (UniqueName: \"kubernetes.io/projected/a70433ec-65b9-46db-8068-4283eb38245a-kube-api-access-4jqvb\") pod \"a70433ec-65b9-46db-8068-4283eb38245a\" (UID: \"a70433ec-65b9-46db-8068-4283eb38245a\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376632 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ff2701-69e8-4c30-a9cd-76a9862849e3-operator-scripts\") pod \"83ff2701-69e8-4c30-a9cd-76a9862849e3\" (UID: \"83ff2701-69e8-4c30-a9cd-76a9862849e3\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376668 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93979ddc-1fca-4d13-bad7-7123ec597957-operator-scripts\") pod \"93979ddc-1fca-4d13-bad7-7123ec597957\" (UID: \"93979ddc-1fca-4d13-bad7-7123ec597957\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376722 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz9g4\" (UniqueName: \"kubernetes.io/projected/445a892a-304f-4d8b-9163-78f5a1011a53-kube-api-access-gz9g4\") pod \"445a892a-304f-4d8b-9163-78f5a1011a53\" (UID: \"445a892a-304f-4d8b-9163-78f5a1011a53\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376765 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfsq6\" (UniqueName: \"kubernetes.io/projected/b45c06ea-1ae1-4524-a3a5-704562d6aaab-kube-api-access-tfsq6\") pod \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\" (UID: \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376809 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b45c06ea-1ae1-4524-a3a5-704562d6aaab-operator-scripts\") pod 
\"b45c06ea-1ae1-4524-a3a5-704562d6aaab\" (UID: \"b45c06ea-1ae1-4524-a3a5-704562d6aaab\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92jr2\" (UniqueName: \"kubernetes.io/projected/93979ddc-1fca-4d13-bad7-7123ec597957-kube-api-access-92jr2\") pod \"93979ddc-1fca-4d13-bad7-7123ec597957\" (UID: \"93979ddc-1fca-4d13-bad7-7123ec597957\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376862 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70433ec-65b9-46db-8068-4283eb38245a-operator-scripts\") pod \"a70433ec-65b9-46db-8068-4283eb38245a\" (UID: \"a70433ec-65b9-46db-8068-4283eb38245a\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376917 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddppq\" (UniqueName: \"kubernetes.io/projected/83ff2701-69e8-4c30-a9cd-76a9862849e3-kube-api-access-ddppq\") pod \"83ff2701-69e8-4c30-a9cd-76a9862849e3\" (UID: \"83ff2701-69e8-4c30-a9cd-76a9862849e3\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.376966 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445a892a-304f-4d8b-9163-78f5a1011a53-operator-scripts\") pod \"445a892a-304f-4d8b-9163-78f5a1011a53\" (UID: \"445a892a-304f-4d8b-9163-78f5a1011a53\") " Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.378108 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/445a892a-304f-4d8b-9163-78f5a1011a53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "445a892a-304f-4d8b-9163-78f5a1011a53" (UID: "445a892a-304f-4d8b-9163-78f5a1011a53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.378258 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93979ddc-1fca-4d13-bad7-7123ec597957-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93979ddc-1fca-4d13-bad7-7123ec597957" (UID: "93979ddc-1fca-4d13-bad7-7123ec597957"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.378917 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83ff2701-69e8-4c30-a9cd-76a9862849e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83ff2701-69e8-4c30-a9cd-76a9862849e3" (UID: "83ff2701-69e8-4c30-a9cd-76a9862849e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.379354 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a70433ec-65b9-46db-8068-4283eb38245a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a70433ec-65b9-46db-8068-4283eb38245a" (UID: "a70433ec-65b9-46db-8068-4283eb38245a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.384820 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b45c06ea-1ae1-4524-a3a5-704562d6aaab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b45c06ea-1ae1-4524-a3a5-704562d6aaab" (UID: "b45c06ea-1ae1-4524-a3a5-704562d6aaab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.393899 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ff2701-69e8-4c30-a9cd-76a9862849e3-kube-api-access-ddppq" (OuterVolumeSpecName: "kube-api-access-ddppq") pod "83ff2701-69e8-4c30-a9cd-76a9862849e3" (UID: "83ff2701-69e8-4c30-a9cd-76a9862849e3"). InnerVolumeSpecName "kube-api-access-ddppq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.398378 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70433ec-65b9-46db-8068-4283eb38245a-kube-api-access-4jqvb" (OuterVolumeSpecName: "kube-api-access-4jqvb") pod "a70433ec-65b9-46db-8068-4283eb38245a" (UID: "a70433ec-65b9-46db-8068-4283eb38245a"). InnerVolumeSpecName "kube-api-access-4jqvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.400269 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b45c06ea-1ae1-4524-a3a5-704562d6aaab-kube-api-access-tfsq6" (OuterVolumeSpecName: "kube-api-access-tfsq6") pod "b45c06ea-1ae1-4524-a3a5-704562d6aaab" (UID: "b45c06ea-1ae1-4524-a3a5-704562d6aaab"). InnerVolumeSpecName "kube-api-access-tfsq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.401967 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93979ddc-1fca-4d13-bad7-7123ec597957-kube-api-access-92jr2" (OuterVolumeSpecName: "kube-api-access-92jr2") pod "93979ddc-1fca-4d13-bad7-7123ec597957" (UID: "93979ddc-1fca-4d13-bad7-7123ec597957"). InnerVolumeSpecName "kube-api-access-92jr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.402116 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/445a892a-304f-4d8b-9163-78f5a1011a53-kube-api-access-gz9g4" (OuterVolumeSpecName: "kube-api-access-gz9g4") pod "445a892a-304f-4d8b-9163-78f5a1011a53" (UID: "445a892a-304f-4d8b-9163-78f5a1011a53"). InnerVolumeSpecName "kube-api-access-gz9g4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.480705 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/445a892a-304f-4d8b-9163-78f5a1011a53-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.480957 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jqvb\" (UniqueName: \"kubernetes.io/projected/a70433ec-65b9-46db-8068-4283eb38245a-kube-api-access-4jqvb\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.481038 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83ff2701-69e8-4c30-a9cd-76a9862849e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.481144 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93979ddc-1fca-4d13-bad7-7123ec597957-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.481259 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz9g4\" (UniqueName: \"kubernetes.io/projected/445a892a-304f-4d8b-9163-78f5a1011a53-kube-api-access-gz9g4\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.481335 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfsq6\" (UniqueName: \"kubernetes.io/projected/b45c06ea-1ae1-4524-a3a5-704562d6aaab-kube-api-access-tfsq6\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.481403 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b45c06ea-1ae1-4524-a3a5-704562d6aaab-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.481498 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92jr2\" (UniqueName: \"kubernetes.io/projected/93979ddc-1fca-4d13-bad7-7123ec597957-kube-api-access-92jr2\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.481581 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70433ec-65b9-46db-8068-4283eb38245a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.481659 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddppq\" (UniqueName: \"kubernetes.io/projected/83ff2701-69e8-4c30-a9cd-76a9862849e3-kube-api-access-ddppq\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.623217 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-sbf6n" event={"ID":"b45c06ea-1ae1-4524-a3a5-704562d6aaab","Type":"ContainerDied","Data":"ee5658cc4ddeaa6e6935e23004181ec958e3ae24d88ca67eb4bd56c124ab0440"} Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.623256 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee5658cc4ddeaa6e6935e23004181ec958e3ae24d88ca67eb4bd56c124ab0440" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.623268 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-sbf6n" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.626694 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerStarted","Data":"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68"} Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.626916 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.630041 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vrlcm" event={"ID":"83ff2701-69e8-4c30-a9cd-76a9862849e3","Type":"ContainerDied","Data":"a63611993dbebb1318fcd2352f0706cbdee4ec2e005bc1ae2088d9c98b57e6e0"} Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.630071 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vrlcm" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.630087 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a63611993dbebb1318fcd2352f0706cbdee4ec2e005bc1ae2088d9c98b57e6e0" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.633124 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fd25-account-create-update-2qvtq" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.633198 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fd25-account-create-update-2qvtq" event={"ID":"93979ddc-1fca-4d13-bad7-7123ec597957","Type":"ContainerDied","Data":"af59d828bb9fceccb0b9d226bd2bc1159c4a0af3fc06113f14bb3478a95447f3"} Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.633243 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af59d828bb9fceccb0b9d226bd2bc1159c4a0af3fc06113f14bb3478a95447f3" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.635297 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fmhwj" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.635287 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fmhwj" event={"ID":"445a892a-304f-4d8b-9163-78f5a1011a53","Type":"ContainerDied","Data":"90c01a0c6fb071e5f7b8624e22fba8c784e9f973e2136560308cc2b41099f7b9"} Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.635435 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c01a0c6fb071e5f7b8624e22fba8c784e9f973e2136560308cc2b41099f7b9" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.639627 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3d32-account-create-update-shdxp" event={"ID":"a70433ec-65b9-46db-8068-4283eb38245a","Type":"ContainerDied","Data":"ede5cc06c706dd22881eb6c89f6bef826c5326e163bae84d1377e2713b1c2dd4"} Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.639675 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ede5cc06c706dd22881eb6c89f6bef826c5326e163bae84d1377e2713b1c2dd4" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.639683 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3d32-account-create-update-shdxp" Dec 01 09:34:55 crc kubenswrapper[4763]: I1201 09:34:55.660368 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.439060489 podStartE2EDuration="7.6603423s" podCreationTimestamp="2025-12-01 09:34:48 +0000 UTC" firstStartedPulling="2025-12-01 09:34:49.294999161 +0000 UTC m=+1206.563647929" lastFinishedPulling="2025-12-01 09:34:54.516280972 +0000 UTC m=+1211.784929740" observedRunningTime="2025-12-01 09:34:55.654613468 +0000 UTC m=+1212.923262236" watchObservedRunningTime="2025-12-01 09:34:55.6603423 +0000 UTC m=+1212.928991078" Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.162561 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.248290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn4kb\" (UniqueName: \"kubernetes.io/projected/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-kube-api-access-jn4kb\") pod \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\" (UID: \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\") " Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.248598 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-operator-scripts\") pod \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\" (UID: \"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca\") " Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.249234 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca" (UID: "6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.254706 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-kube-api-access-jn4kb" (OuterVolumeSpecName: "kube-api-access-jn4kb") pod "6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca" (UID: "6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca"). InnerVolumeSpecName "kube-api-access-jn4kb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.365499 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.365767 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn4kb\" (UniqueName: \"kubernetes.io/projected/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca-kube-api-access-jn4kb\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.651797 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-6992-account-create-update-g66rw" Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.657934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6992-account-create-update-g66rw" event={"ID":"6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca","Type":"ContainerDied","Data":"ca472fe291997eec4978117ff537bab0aaf300f05858bb6833dac15bab32e227"} Dec 01 09:34:56 crc kubenswrapper[4763]: I1201 09:34:56.658067 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca472fe291997eec4978117ff537bab0aaf300f05858bb6833dac15bab32e227" Dec 01 09:34:57 crc kubenswrapper[4763]: I1201 09:34:57.294234 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:57 crc kubenswrapper[4763]: I1201 09:34:57.659884 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="ceilometer-central-agent" containerID="cri-o://dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c" gracePeriod=30 Dec 01 09:34:57 crc kubenswrapper[4763]: I1201 09:34:57.660002 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="ceilometer-notification-agent" containerID="cri-o://16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d" gracePeriod=30 Dec 01 09:34:57 crc kubenswrapper[4763]: I1201 09:34:57.659965 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="sg-core" containerID="cri-o://ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8" gracePeriod=30 Dec 01 09:34:57 crc kubenswrapper[4763]: I1201 09:34:57.660141 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="proxy-httpd" containerID="cri-o://53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68" gracePeriod=30 Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.559876 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.625841 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-config-data\") pod \"df4525a5-2599-48ed-aef4-c784f59d2f5b\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.625966 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-log-httpd\") pod \"df4525a5-2599-48ed-aef4-c784f59d2f5b\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.625985 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-combined-ca-bundle\") pod \"df4525a5-2599-48ed-aef4-c784f59d2f5b\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.626021 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lpcc\" (UniqueName: \"kubernetes.io/projected/df4525a5-2599-48ed-aef4-c784f59d2f5b-kube-api-access-4lpcc\") pod \"df4525a5-2599-48ed-aef4-c784f59d2f5b\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.626048 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-scripts\") pod \"df4525a5-2599-48ed-aef4-c784f59d2f5b\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.626124 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-sg-core-conf-yaml\") pod \"df4525a5-2599-48ed-aef4-c784f59d2f5b\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.626249 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-run-httpd\") pod \"df4525a5-2599-48ed-aef4-c784f59d2f5b\" (UID: \"df4525a5-2599-48ed-aef4-c784f59d2f5b\") " Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.626723 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "df4525a5-2599-48ed-aef4-c784f59d2f5b" (UID: "df4525a5-2599-48ed-aef4-c784f59d2f5b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.629727 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df4525a5-2599-48ed-aef4-c784f59d2f5b" (UID: "df4525a5-2599-48ed-aef4-c784f59d2f5b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.643121 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-scripts" (OuterVolumeSpecName: "scripts") pod "df4525a5-2599-48ed-aef4-c784f59d2f5b" (UID: "df4525a5-2599-48ed-aef4-c784f59d2f5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.655961 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4525a5-2599-48ed-aef4-c784f59d2f5b-kube-api-access-4lpcc" (OuterVolumeSpecName: "kube-api-access-4lpcc") pod "df4525a5-2599-48ed-aef4-c784f59d2f5b" (UID: "df4525a5-2599-48ed-aef4-c784f59d2f5b"). InnerVolumeSpecName "kube-api-access-4lpcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.678974 4763 generic.go:334] "Generic (PLEG): container finished" podID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerID="53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68" exitCode=0 Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679031 4763 generic.go:334] "Generic (PLEG): container finished" podID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerID="ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8" exitCode=2 Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679041 4763 generic.go:334] "Generic (PLEG): container finished" podID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerID="16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d" exitCode=0 Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679049 4763 generic.go:334] "Generic (PLEG): container finished" podID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerID="dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c" exitCode=0 Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679069 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerDied","Data":"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68"} Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679094 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerDied","Data":"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8"} Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679164 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerDied","Data":"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d"} Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679176 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerDied","Data":"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c"} Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df4525a5-2599-48ed-aef4-c784f59d2f5b","Type":"ContainerDied","Data":"1a9f7b723a11a212800a66546457851abc453c9f9bec8b32eadfb52f7a16d2d9"} Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679209 4763 scope.go:117] "RemoveContainer" 
containerID="53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.679229 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.728480 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.728518 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df4525a5-2599-48ed-aef4-c784f59d2f5b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.728552 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lpcc\" (UniqueName: \"kubernetes.io/projected/df4525a5-2599-48ed-aef4-c784f59d2f5b-kube-api-access-4lpcc\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.728564 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.734777 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df4525a5-2599-48ed-aef4-c784f59d2f5b" (UID: "df4525a5-2599-48ed-aef4-c784f59d2f5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.738633 4763 scope.go:117] "RemoveContainer" containerID="ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.786713 4763 scope.go:117] "RemoveContainer" containerID="16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.832624 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.910591 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df4525a5-2599-48ed-aef4-c784f59d2f5b" (UID: "df4525a5-2599-48ed-aef4-c784f59d2f5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.915488 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-config-data" (OuterVolumeSpecName: "config-data") pod "df4525a5-2599-48ed-aef4-c784f59d2f5b" (UID: "df4525a5-2599-48ed-aef4-c784f59d2f5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.934237 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:58 crc kubenswrapper[4763]: I1201 09:34:58.934272 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df4525a5-2599-48ed-aef4-c784f59d2f5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.000613 4763 scope.go:117] "RemoveContainer" containerID="dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.020303 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.028688 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.035073 4763 scope.go:117] "RemoveContainer" containerID="53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.035534 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": container with ID starting with 53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68 not found: ID does not exist" containerID="53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.035563 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68"} err="failed to get container status \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": rpc error: code = NotFound desc = could not find container \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": container with ID starting with 53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68 not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.035586 4763 scope.go:117] "RemoveContainer" containerID="ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.035859 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": container with ID starting with ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8 not found: ID does not exist" containerID="ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.035965 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8"} err="failed to get container status \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": rpc error: code = NotFound desc = could not find container \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": container with ID starting with ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8 not found: ID does not exist" Dec 01 09:34:59 crc 
kubenswrapper[4763]: I1201 09:34:59.036056 4763 scope.go:117] "RemoveContainer" containerID="16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.036451 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": container with ID starting with 16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d not found: ID does not exist" containerID="16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.036556 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d"} err="failed to get container status \"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": rpc error: code = NotFound desc = could not find container \"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": container with ID starting with 16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.036572 4763 scope.go:117] "RemoveContainer" containerID="dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.036769 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": container with ID starting with dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c not found: ID does not exist" containerID="dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.036796 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c"} err="failed to get container status \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": rpc error: code = NotFound desc = could not find container \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": container with ID starting with dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.036810 4763 scope.go:117] "RemoveContainer" containerID="53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.036966 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68"} err="failed to get container status \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": rpc error: code = NotFound desc = could not find container \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": container with ID starting with 53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68 not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.036985 4763 scope.go:117] "RemoveContainer" containerID="ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.037157 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8"} err="failed to get container status \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": rpc error: code = NotFound desc = could not find container \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": container with ID starting with ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8 not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.037174 4763 scope.go:117] "RemoveContainer" containerID="16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.037406 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d"} err="failed to get container status \"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": rpc error: code = NotFound desc = could not find container \"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": container with ID starting with 16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.037423 4763 scope.go:117] "RemoveContainer" containerID="dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.037621 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c"} err="failed to get container status \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": rpc error: code = NotFound desc = could not find container \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": container with ID starting with dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.037644 4763 scope.go:117] "RemoveContainer" containerID="53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.037814 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68"} err="failed to get container status \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": rpc error: code = NotFound desc = could not find container \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": container with ID starting with 53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68 not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.037849 4763 scope.go:117] "RemoveContainer" containerID="ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.039359 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8"} err="failed to get container status \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": rpc error: code = NotFound desc = could not find container \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": container with ID starting with ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8 not found: ID does not exist" Dec 
01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.039404 4763 scope.go:117] "RemoveContainer" containerID="16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.039715 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d"} err="failed to get container status \"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": rpc error: code = NotFound desc = could not find container \"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": container with ID starting with 16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.039747 4763 scope.go:117] "RemoveContainer" containerID="dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.039956 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c"} err="failed to get container status \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": rpc error: code = NotFound desc = could not find container \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": container with ID starting with dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.039980 4763 scope.go:117] "RemoveContainer" containerID="53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.040182 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68"} err="failed to get container status \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": rpc error: code = NotFound desc = could not find container \"53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68\": container with ID starting with 53a7925563f5190ada5eb1b087a9903e85ac784e3a0eb250fcd230f253a03b68 not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.040202 4763 scope.go:117] "RemoveContainer" containerID="ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.042945 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8"} err="failed to get container status \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": rpc error: code = NotFound desc = could not find container \"ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8\": container with ID starting with ddd77df131bc32973617567be1b28ac32019696e7ebfdebb71ddbdc7122d0ea8 not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.042980 4763 scope.go:117] "RemoveContainer" containerID="16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.043482 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d"} err="failed to get container status 
\"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": rpc error: code = NotFound desc = could not find container \"16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d\": container with ID starting with 16122c2106514176b21d674e5e3a6362b68b3cb993aa36110547bd2b69325b1d not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.043670 4763 scope.go:117] "RemoveContainer" containerID="dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.044641 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c"} err="failed to get container status \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": rpc error: code = NotFound desc = could not find container \"dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c\": container with ID starting with dd73b5b88a73cc144e4a6f704e5c1aa41c00b9ab1b9a0a4ac6804472f4da4c0c not found: ID does not exist" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.051444 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052399 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="proxy-httpd" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052420 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="proxy-httpd" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052452 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="sg-core" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052483 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="sg-core" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052504 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="ceilometer-notification-agent" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052510 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="ceilometer-notification-agent" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052525 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ff2701-69e8-4c30-a9cd-76a9862849e3" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052531 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ff2701-69e8-4c30-a9cd-76a9862849e3" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052551 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70433ec-65b9-46db-8068-4283eb38245a" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052557 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70433ec-65b9-46db-8068-4283eb38245a" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052576 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445a892a-304f-4d8b-9163-78f5a1011a53" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052583 
4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="445a892a-304f-4d8b-9163-78f5a1011a53" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052608 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052614 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052627 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="ceilometer-central-agent" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052633 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="ceilometer-central-agent" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052644 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93979ddc-1fca-4d13-bad7-7123ec597957" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052653 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="93979ddc-1fca-4d13-bad7-7123ec597957" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: E1201 09:34:59.052691 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b45c06ea-1ae1-4524-a3a5-704562d6aaab" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.052702 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45c06ea-1ae1-4524-a3a5-704562d6aaab" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053081 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="445a892a-304f-4d8b-9163-78f5a1011a53" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053101 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="proxy-httpd" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053110 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ff2701-69e8-4c30-a9cd-76a9862849e3" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053129 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053147 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70433ec-65b9-46db-8068-4283eb38245a" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053161 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="sg-core" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053178 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="ceilometer-notification-agent" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053196 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b45c06ea-1ae1-4524-a3a5-704562d6aaab" containerName="mariadb-database-create" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 
09:34:59.053205 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="93979ddc-1fca-4d13-bad7-7123ec597957" containerName="mariadb-account-create-update" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.053221 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" containerName="ceilometer-central-agent" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.057450 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.061692 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.061963 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.082776 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.137177 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-config-data\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.137265 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86vcg\" (UniqueName: \"kubernetes.io/projected/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-kube-api-access-86vcg\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.137308 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.137323 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.137342 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-run-httpd\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.137587 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-scripts\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.137869 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-log-httpd\") pod 
\"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.239103 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-log-httpd\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.239521 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-config-data\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.239743 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-log-httpd\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.239555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86vcg\" (UniqueName: \"kubernetes.io/projected/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-kube-api-access-86vcg\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.240373 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.240404 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.240436 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-run-httpd\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.240511 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-scripts\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.241143 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-run-httpd\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.246071 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.247186 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-config-data\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.255112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.261490 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-scripts\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.262143 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86vcg\" (UniqueName: \"kubernetes.io/projected/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-kube-api-access-86vcg\") pod \"ceilometer-0\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.377580 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:34:59 crc kubenswrapper[4763]: I1201 09:34:59.826350 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:00 crc kubenswrapper[4763]: I1201 09:35:00.714749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerStarted","Data":"7854bbf9e2271dc4bbad9ef12eefce2117a113b2298cd26efd5099ef1def05c2"} Dec 01 09:35:00 crc kubenswrapper[4763]: I1201 09:35:00.715416 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerStarted","Data":"38c5d5ecee71a728ace338c0e1ee9deab126aaff9226a0a231337b47ac61457e"} Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.004233 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4525a5-2599-48ed-aef4-c784f59d2f5b" path="/var/lib/kubelet/pods/df4525a5-2599-48ed-aef4-c784f59d2f5b/volumes" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.197046 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b92hz"] Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.198397 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.200915 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.201249 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.201393 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rp477" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.215931 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b92hz"] Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.275974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-scripts\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.276069 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twk2d\" (UniqueName: \"kubernetes.io/projected/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-kube-api-access-twk2d\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.276256 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.276489 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-config-data\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.378016 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twk2d\" (UniqueName: \"kubernetes.io/projected/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-kube-api-access-twk2d\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.378108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.378190 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-config-data\") pod \"nova-cell0-conductor-db-sync-b92hz\" 
(UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.378242 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-scripts\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.388891 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-scripts\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.389729 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-config-data\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.399770 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.401879 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twk2d\" (UniqueName: \"kubernetes.io/projected/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-kube-api-access-twk2d\") pod \"nova-cell0-conductor-db-sync-b92hz\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.513882 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:01 crc kubenswrapper[4763]: I1201 09:35:01.730932 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerStarted","Data":"eff6429a7b6b3d46ba974b5522d09d2371306e181db8dc5bea7e1feecf8cfbe7"} Dec 01 09:35:02 crc kubenswrapper[4763]: I1201 09:35:02.009448 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b92hz"] Dec 01 09:35:02 crc kubenswrapper[4763]: W1201 09:35:02.018200 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2da701d9_7efc_4ae7_bcfe_eeed1e7312a2.slice/crio-602f34e636eb50c8a1f0e6401a293f8a615a7824e55ea9bdf4b6c221394d7421 WatchSource:0}: Error finding container 602f34e636eb50c8a1f0e6401a293f8a615a7824e55ea9bdf4b6c221394d7421: Status 404 returned error can't find the container with id 602f34e636eb50c8a1f0e6401a293f8a615a7824e55ea9bdf4b6c221394d7421 Dec 01 09:35:02 crc kubenswrapper[4763]: I1201 09:35:02.755336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-b92hz" event={"ID":"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2","Type":"ContainerStarted","Data":"602f34e636eb50c8a1f0e6401a293f8a615a7824e55ea9bdf4b6c221394d7421"} Dec 01 09:35:03 crc kubenswrapper[4763]: I1201 09:35:03.766095 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerStarted","Data":"44682a2e2127efab61b3dedfea24929f4c9b67f0afe6ff70082defa619e9ebae"} Dec 01 09:35:03 crc kubenswrapper[4763]: I1201 09:35:03.929550 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:35:03 crc kubenswrapper[4763]: I1201 09:35:03.929614 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:35:05 crc kubenswrapper[4763]: I1201 09:35:05.810321 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerStarted","Data":"873582c9e0789e9a3d11625ff59a2fc9a91e314ca2dbcce83ca904915ec7fb4b"} Dec 01 09:35:05 crc kubenswrapper[4763]: I1201 09:35:05.810729 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:35:05 crc kubenswrapper[4763]: I1201 09:35:05.812658 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:05 crc kubenswrapper[4763]: I1201 09:35:05.835951 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.957616824 podStartE2EDuration="6.835934877s" podCreationTimestamp="2025-12-01 09:34:59 +0000 UTC" firstStartedPulling="2025-12-01 09:34:59.839879908 +0000 UTC m=+1217.108528676" lastFinishedPulling="2025-12-01 09:35:04.718197961 +0000 UTC m=+1221.986846729" observedRunningTime="2025-12-01 
09:35:05.835079073 +0000 UTC m=+1223.103727861" watchObservedRunningTime="2025-12-01 09:35:05.835934877 +0000 UTC m=+1223.104583645" Dec 01 09:35:06 crc kubenswrapper[4763]: I1201 09:35:06.822349 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="ceilometer-central-agent" containerID="cri-o://7854bbf9e2271dc4bbad9ef12eefce2117a113b2298cd26efd5099ef1def05c2" gracePeriod=30 Dec 01 09:35:06 crc kubenswrapper[4763]: I1201 09:35:06.822870 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="proxy-httpd" containerID="cri-o://873582c9e0789e9a3d11625ff59a2fc9a91e314ca2dbcce83ca904915ec7fb4b" gracePeriod=30 Dec 01 09:35:06 crc kubenswrapper[4763]: I1201 09:35:06.822978 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="ceilometer-notification-agent" containerID="cri-o://eff6429a7b6b3d46ba974b5522d09d2371306e181db8dc5bea7e1feecf8cfbe7" gracePeriod=30 Dec 01 09:35:06 crc kubenswrapper[4763]: I1201 09:35:06.823030 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="sg-core" containerID="cri-o://44682a2e2127efab61b3dedfea24929f4c9b67f0afe6ff70082defa619e9ebae" gracePeriod=30 Dec 01 09:35:07 crc kubenswrapper[4763]: I1201 09:35:07.837631 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerID="873582c9e0789e9a3d11625ff59a2fc9a91e314ca2dbcce83ca904915ec7fb4b" exitCode=0 Dec 01 09:35:07 crc kubenswrapper[4763]: I1201 09:35:07.837960 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerID="44682a2e2127efab61b3dedfea24929f4c9b67f0afe6ff70082defa619e9ebae" exitCode=2 Dec 01 09:35:07 crc kubenswrapper[4763]: I1201 09:35:07.837691 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerDied","Data":"873582c9e0789e9a3d11625ff59a2fc9a91e314ca2dbcce83ca904915ec7fb4b"} Dec 01 09:35:07 crc kubenswrapper[4763]: I1201 09:35:07.838004 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerDied","Data":"44682a2e2127efab61b3dedfea24929f4c9b67f0afe6ff70082defa619e9ebae"} Dec 01 09:35:07 crc kubenswrapper[4763]: I1201 09:35:07.838018 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerDied","Data":"eff6429a7b6b3d46ba974b5522d09d2371306e181db8dc5bea7e1feecf8cfbe7"} Dec 01 09:35:07 crc kubenswrapper[4763]: I1201 09:35:07.837971 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerID="eff6429a7b6b3d46ba974b5522d09d2371306e181db8dc5bea7e1feecf8cfbe7" exitCode=0 Dec 01 09:35:10 crc kubenswrapper[4763]: I1201 09:35:10.864282 4763 generic.go:334] "Generic (PLEG): container finished" podID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerID="7854bbf9e2271dc4bbad9ef12eefce2117a113b2298cd26efd5099ef1def05c2" exitCode=0 Dec 01 09:35:10 crc kubenswrapper[4763]: I1201 09:35:10.864359 4763 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerDied","Data":"7854bbf9e2271dc4bbad9ef12eefce2117a113b2298cd26efd5099ef1def05c2"} Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.856032 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.890319 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-scripts\") pod \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.890515 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-config-data\") pod \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.890593 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-combined-ca-bundle\") pod \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.890627 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-log-httpd\") pod \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.890657 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-run-httpd\") pod \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.890682 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-sg-core-conf-yaml\") pod \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.890722 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86vcg\" (UniqueName: \"kubernetes.io/projected/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-kube-api-access-86vcg\") pod \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\" (UID: \"8ba6f6a5-18ea-4311-8a6b-7c4008b09257\") " Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.898821 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-kube-api-access-86vcg" (OuterVolumeSpecName: "kube-api-access-86vcg") pod "8ba6f6a5-18ea-4311-8a6b-7c4008b09257" (UID: "8ba6f6a5-18ea-4311-8a6b-7c4008b09257"). InnerVolumeSpecName "kube-api-access-86vcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.899153 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8ba6f6a5-18ea-4311-8a6b-7c4008b09257" (UID: "8ba6f6a5-18ea-4311-8a6b-7c4008b09257"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.899292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8ba6f6a5-18ea-4311-8a6b-7c4008b09257" (UID: "8ba6f6a5-18ea-4311-8a6b-7c4008b09257"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.905272 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-scripts" (OuterVolumeSpecName: "scripts") pod "8ba6f6a5-18ea-4311-8a6b-7c4008b09257" (UID: "8ba6f6a5-18ea-4311-8a6b-7c4008b09257"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.912442 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-b92hz" event={"ID":"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2","Type":"ContainerStarted","Data":"739b1c44ff01e8daad6a4526a7dab3934bd0772fff7a63a3245992da32155aba"} Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.920965 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8ba6f6a5-18ea-4311-8a6b-7c4008b09257","Type":"ContainerDied","Data":"38c5d5ecee71a728ace338c0e1ee9deab126aaff9226a0a231337b47ac61457e"} Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.921194 4763 scope.go:117] "RemoveContainer" containerID="873582c9e0789e9a3d11625ff59a2fc9a91e314ca2dbcce83ca904915ec7fb4b" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.921485 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.965624 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8ba6f6a5-18ea-4311-8a6b-7c4008b09257" (UID: "8ba6f6a5-18ea-4311-8a6b-7c4008b09257"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.969828 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-b92hz" podStartSLOduration=1.396245363 podStartE2EDuration="10.969801652s" podCreationTimestamp="2025-12-01 09:35:01 +0000 UTC" firstStartedPulling="2025-12-01 09:35:02.02080822 +0000 UTC m=+1219.289456988" lastFinishedPulling="2025-12-01 09:35:11.594364509 +0000 UTC m=+1228.863013277" observedRunningTime="2025-12-01 09:35:11.936204445 +0000 UTC m=+1229.204853213" watchObservedRunningTime="2025-12-01 09:35:11.969801652 +0000 UTC m=+1229.238450410" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.992388 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.992415 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.992425 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.992435 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86vcg\" (UniqueName: \"kubernetes.io/projected/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-kube-api-access-86vcg\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:11 crc kubenswrapper[4763]: I1201 09:35:11.992492 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.009126 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ba6f6a5-18ea-4311-8a6b-7c4008b09257" (UID: "8ba6f6a5-18ea-4311-8a6b-7c4008b09257"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.059045 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-config-data" (OuterVolumeSpecName: "config-data") pod "8ba6f6a5-18ea-4311-8a6b-7c4008b09257" (UID: "8ba6f6a5-18ea-4311-8a6b-7c4008b09257"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.090929 4763 scope.go:117] "RemoveContainer" containerID="44682a2e2127efab61b3dedfea24929f4c9b67f0afe6ff70082defa619e9ebae" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.093891 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.093926 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ba6f6a5-18ea-4311-8a6b-7c4008b09257-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.114444 4763 scope.go:117] "RemoveContainer" containerID="eff6429a7b6b3d46ba974b5522d09d2371306e181db8dc5bea7e1feecf8cfbe7" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.137438 4763 scope.go:117] "RemoveContainer" containerID="7854bbf9e2271dc4bbad9ef12eefce2117a113b2298cd26efd5099ef1def05c2" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.260598 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.276021 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.295393 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:12 crc kubenswrapper[4763]: E1201 09:35:12.295956 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="sg-core" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.295985 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="sg-core" Dec 01 09:35:12 crc kubenswrapper[4763]: E1201 09:35:12.296016 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="ceilometer-notification-agent" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.296027 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="ceilometer-notification-agent" Dec 01 09:35:12 crc kubenswrapper[4763]: E1201 09:35:12.296039 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="ceilometer-central-agent" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.296046 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="ceilometer-central-agent" Dec 01 09:35:12 crc kubenswrapper[4763]: E1201 09:35:12.296094 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="proxy-httpd" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.296101 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="proxy-httpd" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.296504 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="sg-core" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.296552 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" 
containerName="ceilometer-central-agent" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.296567 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="proxy-httpd" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.296575 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" containerName="ceilometer-notification-agent" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.300188 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.310354 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.310686 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.312161 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:35:12 crc kubenswrapper[4763]: E1201 09:35:12.332735 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ba6f6a5_18ea_4311_8a6b_7c4008b09257.slice\": RecentStats: unable to find data in memory cache]" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.397542 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksngw\" (UniqueName: \"kubernetes.io/projected/a8014a1f-3a15-46c9-a4a6-f605b575ab64-kube-api-access-ksngw\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.397605 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.397630 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.397678 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-scripts\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.397966 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-run-httpd\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.398032 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-config-data\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.398089 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-log-httpd\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.499746 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.500406 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-scripts\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.500629 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-run-httpd\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.500696 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-config-data\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.500775 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-log-httpd\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.500888 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksngw\" (UniqueName: \"kubernetes.io/projected/a8014a1f-3a15-46c9-a4a6-f605b575ab64-kube-api-access-ksngw\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.501049 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.501226 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-run-httpd\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.501246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-log-httpd\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.505025 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-scripts\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.505112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.507195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.509678 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-config-data\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.516909 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksngw\" (UniqueName: \"kubernetes.io/projected/a8014a1f-3a15-46c9-a4a6-f605b575ab64-kube-api-access-ksngw\") pod \"ceilometer-0\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " pod="openstack/ceilometer-0" Dec 01 09:35:12 crc kubenswrapper[4763]: I1201 09:35:12.642265 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:35:13 crc kubenswrapper[4763]: I1201 09:35:13.004648 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba6f6a5-18ea-4311-8a6b-7c4008b09257" path="/var/lib/kubelet/pods/8ba6f6a5-18ea-4311-8a6b-7c4008b09257/volumes" Dec 01 09:35:13 crc kubenswrapper[4763]: I1201 09:35:13.108959 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:13 crc kubenswrapper[4763]: I1201 09:35:13.939882 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerStarted","Data":"ce3ce4f406e315b1be4d9c8ed3bc3b9ecf89b6b4feb2507cc70bb8037b4a5c4c"} Dec 01 09:35:14 crc kubenswrapper[4763]: I1201 09:35:14.949606 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerStarted","Data":"bba12f2a86fbc8fec3626434060bfc0dc89ddfbcba9d1f217c2fb95ab958250e"} Dec 01 09:35:14 crc kubenswrapper[4763]: I1201 09:35:14.949906 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerStarted","Data":"d13118aa8280628af60ab4d691b7a42671a606e4fc02cc84cb6f438ca243d27a"} Dec 01 09:35:15 crc kubenswrapper[4763]: I1201 09:35:15.961138 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerStarted","Data":"f1704596ff855fb863d7628ace0532deafb49a0ed6ec417092931bfbde9e54d1"} Dec 01 09:35:18 crc kubenswrapper[4763]: I1201 09:35:18.991797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerStarted","Data":"ded73433730435433b6984b65fb799a4a0ea103c6eaf8ebff02487eca146013a"} Dec 01 09:35:19 crc kubenswrapper[4763]: I1201 09:35:19.004698 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:35:19 crc kubenswrapper[4763]: I1201 09:35:19.022410 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.521048846 podStartE2EDuration="7.022387531s" podCreationTimestamp="2025-12-01 09:35:12 +0000 UTC" firstStartedPulling="2025-12-01 09:35:13.100630866 +0000 UTC m=+1230.369279644" lastFinishedPulling="2025-12-01 09:35:18.601969561 +0000 UTC m=+1235.870618329" observedRunningTime="2025-12-01 09:35:19.02054909 +0000 UTC m=+1236.289197858" watchObservedRunningTime="2025-12-01 09:35:19.022387531 +0000 UTC m=+1236.291036299" Dec 01 09:35:23 crc kubenswrapper[4763]: I1201 09:35:23.213575 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:23 crc kubenswrapper[4763]: I1201 09:35:23.214341 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="ceilometer-central-agent" containerID="cri-o://d13118aa8280628af60ab4d691b7a42671a606e4fc02cc84cb6f438ca243d27a" gracePeriod=30 Dec 01 09:35:23 crc kubenswrapper[4763]: I1201 09:35:23.214375 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="proxy-httpd" containerID="cri-o://ded73433730435433b6984b65fb799a4a0ea103c6eaf8ebff02487eca146013a" 
gracePeriod=30 Dec 01 09:35:23 crc kubenswrapper[4763]: I1201 09:35:23.214509 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="sg-core" containerID="cri-o://f1704596ff855fb863d7628ace0532deafb49a0ed6ec417092931bfbde9e54d1" gracePeriod=30 Dec 01 09:35:23 crc kubenswrapper[4763]: I1201 09:35:23.214563 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="ceilometer-notification-agent" containerID="cri-o://bba12f2a86fbc8fec3626434060bfc0dc89ddfbcba9d1f217c2fb95ab958250e" gracePeriod=30 Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.037843 4763 generic.go:334] "Generic (PLEG): container finished" podID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerID="ded73433730435433b6984b65fb799a4a0ea103c6eaf8ebff02487eca146013a" exitCode=0 Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.038127 4763 generic.go:334] "Generic (PLEG): container finished" podID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerID="f1704596ff855fb863d7628ace0532deafb49a0ed6ec417092931bfbde9e54d1" exitCode=2 Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.038137 4763 generic.go:334] "Generic (PLEG): container finished" podID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerID="bba12f2a86fbc8fec3626434060bfc0dc89ddfbcba9d1f217c2fb95ab958250e" exitCode=0 Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.038144 4763 generic.go:334] "Generic (PLEG): container finished" podID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerID="d13118aa8280628af60ab4d691b7a42671a606e4fc02cc84cb6f438ca243d27a" exitCode=0 Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.037922 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerDied","Data":"ded73433730435433b6984b65fb799a4a0ea103c6eaf8ebff02487eca146013a"} Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.038210 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerDied","Data":"f1704596ff855fb863d7628ace0532deafb49a0ed6ec417092931bfbde9e54d1"} Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.038224 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerDied","Data":"bba12f2a86fbc8fec3626434060bfc0dc89ddfbcba9d1f217c2fb95ab958250e"} Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.038234 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerDied","Data":"d13118aa8280628af60ab4d691b7a42671a606e4fc02cc84cb6f438ca243d27a"} Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.043596 4763 generic.go:334] "Generic (PLEG): container finished" podID="2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" containerID="739b1c44ff01e8daad6a4526a7dab3934bd0772fff7a63a3245992da32155aba" exitCode=0 Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.043622 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-b92hz" event={"ID":"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2","Type":"ContainerDied","Data":"739b1c44ff01e8daad6a4526a7dab3934bd0772fff7a63a3245992da32155aba"} Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 
09:35:24.165098 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.316472 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-scripts\") pod \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.316930 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-config-data\") pod \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.317088 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-log-httpd\") pod \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.317162 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksngw\" (UniqueName: \"kubernetes.io/projected/a8014a1f-3a15-46c9-a4a6-f605b575ab64-kube-api-access-ksngw\") pod \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.317248 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-run-httpd\") pod \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.317960 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-sg-core-conf-yaml\") pod \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.317600 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a8014a1f-3a15-46c9-a4a6-f605b575ab64" (UID: "a8014a1f-3a15-46c9-a4a6-f605b575ab64"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.317624 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a8014a1f-3a15-46c9-a4a6-f605b575ab64" (UID: "a8014a1f-3a15-46c9-a4a6-f605b575ab64"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.318034 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-combined-ca-bundle\") pod \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\" (UID: \"a8014a1f-3a15-46c9-a4a6-f605b575ab64\") " Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.318648 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.318680 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8014a1f-3a15-46c9-a4a6-f605b575ab64-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.323289 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-scripts" (OuterVolumeSpecName: "scripts") pod "a8014a1f-3a15-46c9-a4a6-f605b575ab64" (UID: "a8014a1f-3a15-46c9-a4a6-f605b575ab64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.323694 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8014a1f-3a15-46c9-a4a6-f605b575ab64-kube-api-access-ksngw" (OuterVolumeSpecName: "kube-api-access-ksngw") pod "a8014a1f-3a15-46c9-a4a6-f605b575ab64" (UID: "a8014a1f-3a15-46c9-a4a6-f605b575ab64"). InnerVolumeSpecName "kube-api-access-ksngw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.351520 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a8014a1f-3a15-46c9-a4a6-f605b575ab64" (UID: "a8014a1f-3a15-46c9-a4a6-f605b575ab64"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.412703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8014a1f-3a15-46c9-a4a6-f605b575ab64" (UID: "a8014a1f-3a15-46c9-a4a6-f605b575ab64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.420431 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksngw\" (UniqueName: \"kubernetes.io/projected/a8014a1f-3a15-46c9-a4a6-f605b575ab64-kube-api-access-ksngw\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.420547 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.421090 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.421110 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.437087 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-config-data" (OuterVolumeSpecName: "config-data") pod "a8014a1f-3a15-46c9-a4a6-f605b575ab64" (UID: "a8014a1f-3a15-46c9-a4a6-f605b575ab64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:24 crc kubenswrapper[4763]: I1201 09:35:24.522822 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8014a1f-3a15-46c9-a4a6-f605b575ab64-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.060499 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8014a1f-3a15-46c9-a4a6-f605b575ab64","Type":"ContainerDied","Data":"ce3ce4f406e315b1be4d9c8ed3bc3b9ecf89b6b4feb2507cc70bb8037b4a5c4c"} Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.060532 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.060572 4763 scope.go:117] "RemoveContainer" containerID="ded73433730435433b6984b65fb799a4a0ea103c6eaf8ebff02487eca146013a" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.095940 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.096212 4763 scope.go:117] "RemoveContainer" containerID="f1704596ff855fb863d7628ace0532deafb49a0ed6ec417092931bfbde9e54d1" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.105422 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.122117 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:25 crc kubenswrapper[4763]: E1201 09:35:25.123999 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="sg-core" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.124024 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="sg-core" Dec 01 09:35:25 crc kubenswrapper[4763]: E1201 09:35:25.124034 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="proxy-httpd" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.124040 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="proxy-httpd" Dec 01 09:35:25 crc kubenswrapper[4763]: E1201 09:35:25.124053 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="ceilometer-central-agent" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.124059 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="ceilometer-central-agent" Dec 01 09:35:25 crc kubenswrapper[4763]: E1201 09:35:25.124076 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="ceilometer-notification-agent" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.124082 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="ceilometer-notification-agent" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.124239 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="ceilometer-notification-agent" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.124257 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="proxy-httpd" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.124271 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="sg-core" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.124282 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" containerName="ceilometer-central-agent" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.125707 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.126481 4763 scope.go:117] "RemoveContainer" containerID="bba12f2a86fbc8fec3626434060bfc0dc89ddfbcba9d1f217c2fb95ab958250e" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.127948 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.130427 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.151127 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.166793 4763 scope.go:117] "RemoveContainer" containerID="d13118aa8280628af60ab4d691b7a42671a606e4fc02cc84cb6f438ca243d27a" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.236124 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.236228 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-log-httpd\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.236260 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-scripts\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.239498 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-run-httpd\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.239570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.239607 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnkz2\" (UniqueName: \"kubernetes.io/projected/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-kube-api-access-wnkz2\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.239765 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-config-data\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 
09:35:25.342088 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-log-httpd\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.342149 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-scripts\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.342194 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-run-httpd\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.342221 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.342250 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnkz2\" (UniqueName: \"kubernetes.io/projected/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-kube-api-access-wnkz2\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.342881 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-run-httpd\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.342948 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-log-httpd\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.344076 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-config-data\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.344112 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.360350 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-scripts\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.360393 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.360736 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.361203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-config-data\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.364421 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnkz2\" (UniqueName: \"kubernetes.io/projected/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-kube-api-access-wnkz2\") pod \"ceilometer-0\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.448997 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.450504 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.547974 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-config-data\") pod \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.548057 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-combined-ca-bundle\") pod \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.548258 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twk2d\" (UniqueName: \"kubernetes.io/projected/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-kube-api-access-twk2d\") pod \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.548279 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-scripts\") pod \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\" (UID: \"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2\") " Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.552923 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-scripts" (OuterVolumeSpecName: "scripts") pod "2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" (UID: "2da701d9-7efc-4ae7-bcfe-eeed1e7312a2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.556295 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-kube-api-access-twk2d" (OuterVolumeSpecName: "kube-api-access-twk2d") pod "2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" (UID: "2da701d9-7efc-4ae7-bcfe-eeed1e7312a2"). InnerVolumeSpecName "kube-api-access-twk2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.594594 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-config-data" (OuterVolumeSpecName: "config-data") pod "2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" (UID: "2da701d9-7efc-4ae7-bcfe-eeed1e7312a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.602787 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" (UID: "2da701d9-7efc-4ae7-bcfe-eeed1e7312a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.650563 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twk2d\" (UniqueName: \"kubernetes.io/projected/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-kube-api-access-twk2d\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.650589 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.650598 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.650607 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:25 crc kubenswrapper[4763]: I1201 09:35:25.922783 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.071694 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerStarted","Data":"cff2d5fa34f7114cab40d5718059b81e6e03cb2f8bce67efdbd9ec4c01d321c1"} Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.073605 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-b92hz" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.073618 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-b92hz" event={"ID":"2da701d9-7efc-4ae7-bcfe-eeed1e7312a2","Type":"ContainerDied","Data":"602f34e636eb50c8a1f0e6401a293f8a615a7824e55ea9bdf4b6c221394d7421"} Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.073640 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="602f34e636eb50c8a1f0e6401a293f8a615a7824e55ea9bdf4b6c221394d7421" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.161000 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 09:35:26 crc kubenswrapper[4763]: E1201 09:35:26.161424 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" containerName="nova-cell0-conductor-db-sync" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.161450 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" containerName="nova-cell0-conductor-db-sync" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.161676 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" containerName="nova-cell0-conductor-db-sync" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.162418 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.164197 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.164669 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rp477" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.178486 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.261382 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b95396-e7dd-4b49-b465-db158816b7ea-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.261443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b95396-e7dd-4b49-b465-db158816b7ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.261548 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k97nw\" (UniqueName: \"kubernetes.io/projected/a0b95396-e7dd-4b49-b465-db158816b7ea-kube-api-access-k97nw\") pod \"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.363789 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b95396-e7dd-4b49-b465-db158816b7ea-combined-ca-bundle\") pod 
\"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.363860 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b95396-e7dd-4b49-b465-db158816b7ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.363962 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k97nw\" (UniqueName: \"kubernetes.io/projected/a0b95396-e7dd-4b49-b465-db158816b7ea-kube-api-access-k97nw\") pod \"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.369085 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0b95396-e7dd-4b49-b465-db158816b7ea-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.374284 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0b95396-e7dd-4b49-b465-db158816b7ea-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.384701 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k97nw\" (UniqueName: \"kubernetes.io/projected/a0b95396-e7dd-4b49-b465-db158816b7ea-kube-api-access-k97nw\") pod \"nova-cell0-conductor-0\" (UID: \"a0b95396-e7dd-4b49-b465-db158816b7ea\") " pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.484239 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:26 crc kubenswrapper[4763]: I1201 09:35:26.921093 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 01 09:35:26 crc kubenswrapper[4763]: W1201 09:35:26.921988 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0b95396_e7dd_4b49_b465_db158816b7ea.slice/crio-52f617e97b3edd19bf7850e2e2ce3c54dd93c190ad1f4af36c163edc6d1c1b4a WatchSource:0}: Error finding container 52f617e97b3edd19bf7850e2e2ce3c54dd93c190ad1f4af36c163edc6d1c1b4a: Status 404 returned error can't find the container with id 52f617e97b3edd19bf7850e2e2ce3c54dd93c190ad1f4af36c163edc6d1c1b4a Dec 01 09:35:27 crc kubenswrapper[4763]: I1201 09:35:27.008439 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8014a1f-3a15-46c9-a4a6-f605b575ab64" path="/var/lib/kubelet/pods/a8014a1f-3a15-46c9-a4a6-f605b575ab64/volumes" Dec 01 09:35:27 crc kubenswrapper[4763]: I1201 09:35:27.100085 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerStarted","Data":"54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62"} Dec 01 09:35:27 crc kubenswrapper[4763]: I1201 09:35:27.101948 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a0b95396-e7dd-4b49-b465-db158816b7ea","Type":"ContainerStarted","Data":"52f617e97b3edd19bf7850e2e2ce3c54dd93c190ad1f4af36c163edc6d1c1b4a"} Dec 01 09:35:28 crc kubenswrapper[4763]: I1201 09:35:28.114209 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a0b95396-e7dd-4b49-b465-db158816b7ea","Type":"ContainerStarted","Data":"94f2ace859dc32e17fe06867fa8ad8e6e1ac5ecf4a0ad8b01f7986b25f34dc24"} Dec 01 09:35:28 crc kubenswrapper[4763]: I1201 09:35:28.114981 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:28 crc kubenswrapper[4763]: I1201 09:35:28.138001 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.137975731 podStartE2EDuration="2.137975731s" podCreationTimestamp="2025-12-01 09:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:35:28.128942597 +0000 UTC m=+1245.397591365" watchObservedRunningTime="2025-12-01 09:35:28.137975731 +0000 UTC m=+1245.406624499" Dec 01 09:35:29 crc kubenswrapper[4763]: I1201 09:35:29.124438 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerStarted","Data":"cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b"} Dec 01 09:35:29 crc kubenswrapper[4763]: I1201 09:35:29.124924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerStarted","Data":"94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa"} Dec 01 09:35:32 crc kubenswrapper[4763]: I1201 09:35:32.150225 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerStarted","Data":"1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb"} Dec 01 09:35:32 crc kubenswrapper[4763]: I1201 09:35:32.151112 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:35:32 crc kubenswrapper[4763]: I1201 09:35:32.180661 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.143317254 podStartE2EDuration="7.180642541s" podCreationTimestamp="2025-12-01 09:35:25 +0000 UTC" firstStartedPulling="2025-12-01 09:35:25.933927286 +0000 UTC m=+1243.202576054" lastFinishedPulling="2025-12-01 09:35:30.971252573 +0000 UTC m=+1248.239901341" observedRunningTime="2025-12-01 09:35:32.17775948 +0000 UTC m=+1249.446408268" watchObservedRunningTime="2025-12-01 09:35:32.180642541 +0000 UTC m=+1249.449291309" Dec 01 09:35:33 crc kubenswrapper[4763]: I1201 09:35:33.929835 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:35:33 crc kubenswrapper[4763]: I1201 09:35:33.930364 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:35:33 crc kubenswrapper[4763]: I1201 09:35:33.930424 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:35:33 crc kubenswrapper[4763]: I1201 09:35:33.930999 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdb76d67e51814424a96785e6ed38c02e1e5ea6f161d5d45ba5cfcfc9064da51"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:35:33 crc kubenswrapper[4763]: I1201 09:35:33.931044 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://cdb76d67e51814424a96785e6ed38c02e1e5ea6f161d5d45ba5cfcfc9064da51" gracePeriod=600 Dec 01 09:35:34 crc kubenswrapper[4763]: I1201 09:35:34.169393 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="cdb76d67e51814424a96785e6ed38c02e1e5ea6f161d5d45ba5cfcfc9064da51" exitCode=0 Dec 01 09:35:34 crc kubenswrapper[4763]: I1201 09:35:34.169441 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"cdb76d67e51814424a96785e6ed38c02e1e5ea6f161d5d45ba5cfcfc9064da51"} Dec 01 09:35:34 crc kubenswrapper[4763]: I1201 09:35:34.169511 4763 scope.go:117] "RemoveContainer" containerID="fbcaa44c81e6e848c09eeb8a68cb5f7f03225b440f52ed6609277022adeaf191" Dec 01 09:35:35 crc kubenswrapper[4763]: I1201 09:35:35.179215 4763 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"57d5657ea17b09564dcba7a4e51f73f6b9a810185f0715911e5b25596bc9c73c"} Dec 01 09:35:36 crc kubenswrapper[4763]: I1201 09:35:36.525292 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.009305 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dfdsm"] Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.010702 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.012597 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.017530 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.026205 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dfdsm"] Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.157340 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.157578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-config-data\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.157604 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-scripts\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.157784 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj49w\" (UniqueName: \"kubernetes.io/projected/a198f66c-6f38-4b84-b2bd-898f00d40932-kube-api-access-fj49w\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.219741 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.221385 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.223788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.237868 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.268178 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-config-data\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.268237 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-scripts\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.268325 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj49w\" (UniqueName: \"kubernetes.io/projected/a198f66c-6f38-4b84-b2bd-898f00d40932-kube-api-access-fj49w\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.268426 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.286100 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-scripts\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.291420 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.298645 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-config-data\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.309577 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.311350 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.340532 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.359176 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj49w\" (UniqueName: \"kubernetes.io/projected/a198f66c-6f38-4b84-b2bd-898f00d40932-kube-api-access-fj49w\") pod \"nova-cell0-cell-mapping-dfdsm\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.384936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.385207 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqgpd\" (UniqueName: \"kubernetes.io/projected/a777a5f1-1e8f-451c-b320-a35976b3f6ba-kube-api-access-zqgpd\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.385330 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.385353 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a777a5f1-1e8f-451c-b320-a35976b3f6ba-logs\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.385619 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-config-data\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.459083 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.461985 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.489893 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.489937 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqgpd\" (UniqueName: \"kubernetes.io/projected/a777a5f1-1e8f-451c-b320-a35976b3f6ba-kube-api-access-zqgpd\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.489991 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a777a5f1-1e8f-451c-b320-a35976b3f6ba-logs\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.490025 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-config-data\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.490043 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-config-data\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.490059 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc6zn\" (UniqueName: \"kubernetes.io/projected/fa6e2a2f-e100-4666-a248-33fe0feae804-kube-api-access-jc6zn\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.490114 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.492495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a777a5f1-1e8f-451c-b320-a35976b3f6ba-logs\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.492997 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.500358 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-config-data\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.516306 4763 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.519554 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.543867 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqgpd\" (UniqueName: \"kubernetes.io/projected/a777a5f1-1e8f-451c-b320-a35976b3f6ba-kube-api-access-zqgpd\") pod \"nova-api-0\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.562914 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.594340 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-config-data\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.594408 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-config-data\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.594443 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc6zn\" (UniqueName: \"kubernetes.io/projected/fa6e2a2f-e100-4666-a248-33fe0feae804-kube-api-access-jc6zn\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.594561 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.594617 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b6286d-535e-4fb8-81c0-b30b0d9b151a-logs\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.594665 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.594743 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzr6\" (UniqueName: \"kubernetes.io/projected/47b6286d-535e-4fb8-81c0-b30b0d9b151a-kube-api-access-chzr6\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0" Dec 01 
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.602075 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-config-data\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.627199 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.630108 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.643781 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dfdsm"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.645770 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.646580 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.666905 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc6zn\" (UniqueName: \"kubernetes.io/projected/fa6e2a2f-e100-4666-a248-33fe0feae804-kube-api-access-jc6zn\") pod \"nova-scheduler-0\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " pod="openstack/nova-scheduler-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.676113 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.696819 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.696900 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b6286d-535e-4fb8-81c0-b30b0d9b151a-logs\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.696985 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chzr6\" (UniqueName: \"kubernetes.io/projected/47b6286d-535e-4fb8-81c0-b30b0d9b151a-kube-api-access-chzr6\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.697028 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-config-data\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.699190 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b6286d-535e-4fb8-81c0-b30b0d9b151a-logs\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.714146 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.718516 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-config-data\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.718910 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.734866 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gzcx2"]
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.741771 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.767768 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gzcx2"]
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.775135 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzr6\" (UniqueName: \"kubernetes.io/projected/47b6286d-535e-4fb8-81c0-b30b0d9b151a-kube-api-access-chzr6\") pod \"nova-metadata-0\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " pod="openstack/nova-metadata-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.798255 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.798312 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.798384 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kxf8\" (UniqueName: \"kubernetes.io/projected/649aeb71-482b-4683-a727-252060682032-kube-api-access-7kxf8\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.900615 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.901428 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-config\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.901480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.901507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.901568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kxf8\" (UniqueName: \"kubernetes.io/projected/649aeb71-482b-4683-a727-252060682032-kube-api-access-7kxf8\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.901619 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.901658 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kgtd\" (UniqueName: \"kubernetes.io/projected/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-kube-api-access-7kgtd\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.901714 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.912721 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.922184 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.927428 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kxf8\" (UniqueName: \"kubernetes.io/projected/649aeb71-482b-4683-a727-252060682032-kube-api-access-7kxf8\") pod \"nova-cell1-novncproxy-0\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:37 crc kubenswrapper[4763]: I1201 09:35:37.970524 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.006555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.006611 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kgtd\" (UniqueName: \"kubernetes.io/projected/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-kube-api-access-7kgtd\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.007274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.007383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.007421 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-config\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.007915 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.008637 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.008846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-config\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.009147 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.027710 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.039328 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kgtd\" (UniqueName: \"kubernetes.io/projected/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-kube-api-access-7kgtd\") pod \"dnsmasq-dns-8b8cf6657-gzcx2\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.097896 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2"
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.362428 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.378685 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 09:35:38 crc kubenswrapper[4763]: W1201 09:35:38.462051 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda198f66c_6f38_4b84_b2bd_898f00d40932.slice/crio-637753b9fc27a8bc861ee5baac82685900492b38afc8b62bd370b824428bcfad WatchSource:0}: Error finding container 637753b9fc27a8bc861ee5baac82685900492b38afc8b62bd370b824428bcfad: Status 404 returned error can't find the container with id 637753b9fc27a8bc861ee5baac82685900492b38afc8b62bd370b824428bcfad
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.462322 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dfdsm"]
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.595992 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.646340 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.677992 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g9pwc"]
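[Editor's note] The W1201 manager.go:1169 warning above comes from cAdvisor, which runs inside the kubelet and watches the cgroup tree. It noticed the new crio-637753b9... cgroup before CRI-O had finished registering that container, queried for it, and got a 404. The same ID shows up shortly afterwards (09:35:39.245062) in a successful ContainerStarted event for openstack/nova-cell0-cell-mapping-dfdsm, so this is a benign startup race rather than a lost container. A minimal sketch to confirm that reading mechanically, against the same illustrative kubelet.log export as above:

    # Confirm every container ID from a cAdvisor "Failed to process watch
    # event ... 404" warning later appears in a PLEG ContainerStarted event.
    import re

    warn = re.compile(r'Failed to process watch event.*crio-([0-9a-f]{64})')
    started = re.compile(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"')

    pending, seen = set(), set()
    with open("kubelet.log") as fh:
        for line in fh:
            if (m := warn.search(line)):
                pending.add(m.group(1))
            for m in started.finditer(line):
                seen.add(m.group(1))
    for cid in sorted(pending):
        status = "benign (started later)" if cid in seen else "never started"
        print(f"{cid[:12]}  {status}")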
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.682340 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.685164 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.708268 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g9pwc"] Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.744272 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gzcx2"] Dec 01 09:35:38 crc kubenswrapper[4763]: W1201 09:35:38.755546 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47b6286d_535e_4fb8_81c0_b30b0d9b151a.slice/crio-47a4146792a91e2e5acaad40f0f0bf4fa148d95f0533ac45ce82bf996c7bcb10 WatchSource:0}: Error finding container 47a4146792a91e2e5acaad40f0f0bf4fa148d95f0533ac45ce82bf996c7bcb10: Status 404 returned error can't find the container with id 47a4146792a91e2e5acaad40f0f0bf4fa148d95f0533ac45ce82bf996c7bcb10 Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.762902 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.826745 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.826803 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-scripts\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.827219 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-config-data\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.827276 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqncn\" (UniqueName: \"kubernetes.io/projected/d0b66291-ad35-4f2b-b21c-da8de9a419d2-kube-api-access-xqncn\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.932761 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-config-data\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc 
kubenswrapper[4763]: I1201 09:35:38.932852 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqncn\" (UniqueName: \"kubernetes.io/projected/d0b66291-ad35-4f2b-b21c-da8de9a419d2-kube-api-access-xqncn\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.932911 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.932937 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-scripts\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.936699 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.936839 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-scripts\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.938716 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-config-data\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:38 crc kubenswrapper[4763]: I1201 09:35:38.947856 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqncn\" (UniqueName: \"kubernetes.io/projected/d0b66291-ad35-4f2b-b21c-da8de9a419d2-kube-api-access-xqncn\") pod \"nova-cell1-conductor-db-sync-g9pwc\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:39 crc kubenswrapper[4763]: I1201 09:35:39.022593 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:39 crc kubenswrapper[4763]: I1201 09:35:39.238998 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"649aeb71-482b-4683-a727-252060682032","Type":"ContainerStarted","Data":"40f7dac49d75dd7b844ac7e8ad449dade62a0c1145aa1e12974bc44790056fe1"} Dec 01 09:35:39 crc kubenswrapper[4763]: I1201 09:35:39.243543 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47b6286d-535e-4fb8-81c0-b30b0d9b151a","Type":"ContainerStarted","Data":"47a4146792a91e2e5acaad40f0f0bf4fa148d95f0533ac45ce82bf996c7bcb10"} Dec 01 09:35:39 crc kubenswrapper[4763]: I1201 09:35:39.245062 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dfdsm" event={"ID":"a198f66c-6f38-4b84-b2bd-898f00d40932","Type":"ContainerStarted","Data":"637753b9fc27a8bc861ee5baac82685900492b38afc8b62bd370b824428bcfad"} Dec 01 09:35:39 crc kubenswrapper[4763]: I1201 09:35:39.246730 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" event={"ID":"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e","Type":"ContainerStarted","Data":"6be91e1bbb2433608453434cd4e1ed5224bf7073bdca26a3f8fbf25bb3d72372"} Dec 01 09:35:39 crc kubenswrapper[4763]: I1201 09:35:39.248003 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a777a5f1-1e8f-451c-b320-a35976b3f6ba","Type":"ContainerStarted","Data":"ea0e7b32807d850b4e41f8487873b736f8938fc9584f4aee566be5e884f4e3ba"} Dec 01 09:35:39 crc kubenswrapper[4763]: I1201 09:35:39.249572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa6e2a2f-e100-4666-a248-33fe0feae804","Type":"ContainerStarted","Data":"8a465fb93bc0b25c38ba880a4dee2495f7d476ef7944f346e3c8b4c6ffbe0eb4"} Dec 01 09:35:39 crc kubenswrapper[4763]: I1201 09:35:39.538105 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g9pwc"] Dec 01 09:35:40 crc kubenswrapper[4763]: I1201 09:35:40.300704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dfdsm" event={"ID":"a198f66c-6f38-4b84-b2bd-898f00d40932","Type":"ContainerStarted","Data":"0c6926e444bcc6c38baa09d6d0b3aaf27db832b3d984b682bbe88da39600f620"} Dec 01 09:35:40 crc kubenswrapper[4763]: I1201 09:35:40.314242 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" containerID="4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df" exitCode=0 Dec 01 09:35:40 crc kubenswrapper[4763]: I1201 09:35:40.314306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" event={"ID":"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e","Type":"ContainerDied","Data":"4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df"} Dec 01 09:35:40 crc kubenswrapper[4763]: I1201 09:35:40.321601 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" event={"ID":"d0b66291-ad35-4f2b-b21c-da8de9a419d2","Type":"ContainerStarted","Data":"fb2b85355691165fcc2f4bc4e6af80c4dc40b760c308d24019ebeb2755f66632"} Dec 01 09:35:40 crc kubenswrapper[4763]: I1201 09:35:40.321650 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" 
event={"ID":"d0b66291-ad35-4f2b-b21c-da8de9a419d2","Type":"ContainerStarted","Data":"fdb312f3ec0dc814e15c1f7a9fc60249716c1d432c059f29ecb46e7a4113ee66"} Dec 01 09:35:40 crc kubenswrapper[4763]: I1201 09:35:40.325984 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dfdsm" podStartSLOduration=4.325966963 podStartE2EDuration="4.325966963s" podCreationTimestamp="2025-12-01 09:35:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:35:40.32409646 +0000 UTC m=+1257.592745228" watchObservedRunningTime="2025-12-01 09:35:40.325966963 +0000 UTC m=+1257.594615731" Dec 01 09:35:40 crc kubenswrapper[4763]: I1201 09:35:40.359957 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" podStartSLOduration=2.35993432 podStartE2EDuration="2.35993432s" podCreationTimestamp="2025-12-01 09:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:35:40.356765991 +0000 UTC m=+1257.625414759" watchObservedRunningTime="2025-12-01 09:35:40.35993432 +0000 UTC m=+1257.628583088" Dec 01 09:35:41 crc kubenswrapper[4763]: I1201 09:35:41.651910 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:35:41 crc kubenswrapper[4763]: I1201 09:35:41.696508 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.361032 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a777a5f1-1e8f-451c-b320-a35976b3f6ba","Type":"ContainerStarted","Data":"5180a120175bac76203dda520231caf165e3fde010473e7c342a2f780d6ab30c"} Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.361586 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a777a5f1-1e8f-451c-b320-a35976b3f6ba","Type":"ContainerStarted","Data":"7c9be0e2cb682434fa9415bb667b6821db25e9672bf1ebd2017765786491c0bc"} Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.364784 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa6e2a2f-e100-4666-a248-33fe0feae804","Type":"ContainerStarted","Data":"4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe"} Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.366981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"649aeb71-482b-4683-a727-252060682032","Type":"ContainerStarted","Data":"76a45d3100b95b0c0e254da3df01d201dca0ba6fad244c10c1ae0b8b5ff57cb4"} Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.367118 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="649aeb71-482b-4683-a727-252060682032" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://76a45d3100b95b0c0e254da3df01d201dca0ba6fad244c10c1ae0b8b5ff57cb4" gracePeriod=30 Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.371495 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47b6286d-535e-4fb8-81c0-b30b0d9b151a","Type":"ContainerStarted","Data":"650648affd993e2501d41d603948c620e6aa62ffcae2148d65199a5089239139"} Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.371548 
4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47b6286d-535e-4fb8-81c0-b30b0d9b151a","Type":"ContainerStarted","Data":"b0c1d0110fc1838000b9dec9259d57fdf2c821d3d77dde3a7ed5621cf6fd0bc6"} Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.371672 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerName="nova-metadata-log" containerID="cri-o://b0c1d0110fc1838000b9dec9259d57fdf2c821d3d77dde3a7ed5621cf6fd0bc6" gracePeriod=30 Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.371707 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerName="nova-metadata-metadata" containerID="cri-o://650648affd993e2501d41d603948c620e6aa62ffcae2148d65199a5089239139" gracePeriod=30 Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.386590 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" event={"ID":"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e","Type":"ContainerStarted","Data":"ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8"} Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.387306 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.392651 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.027783641 podStartE2EDuration="6.392634532s" podCreationTimestamp="2025-12-01 09:35:37 +0000 UTC" firstStartedPulling="2025-12-01 09:35:38.37849129 +0000 UTC m=+1255.647140058" lastFinishedPulling="2025-12-01 09:35:42.743342181 +0000 UTC m=+1260.011990949" observedRunningTime="2025-12-01 09:35:43.38476607 +0000 UTC m=+1260.653414838" watchObservedRunningTime="2025-12-01 09:35:43.392634532 +0000 UTC m=+1260.661283300" Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.418869 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.431627534 podStartE2EDuration="6.418852081s" podCreationTimestamp="2025-12-01 09:35:37 +0000 UTC" firstStartedPulling="2025-12-01 09:35:38.757544894 +0000 UTC m=+1256.026193662" lastFinishedPulling="2025-12-01 09:35:42.744769441 +0000 UTC m=+1260.013418209" observedRunningTime="2025-12-01 09:35:43.404724523 +0000 UTC m=+1260.673373291" watchObservedRunningTime="2025-12-01 09:35:43.418852081 +0000 UTC m=+1260.687500849" Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.435857 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.401123705 podStartE2EDuration="6.43583756s" podCreationTimestamp="2025-12-01 09:35:37 +0000 UTC" firstStartedPulling="2025-12-01 09:35:38.709659004 +0000 UTC m=+1255.978307762" lastFinishedPulling="2025-12-01 09:35:42.744372849 +0000 UTC m=+1260.013021617" observedRunningTime="2025-12-01 09:35:43.428904385 +0000 UTC m=+1260.697553163" watchObservedRunningTime="2025-12-01 09:35:43.43583756 +0000 UTC m=+1260.704486318" Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.452971 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.331229583 podStartE2EDuration="6.452952942s" podCreationTimestamp="2025-12-01 09:35:37 
+0000 UTC" firstStartedPulling="2025-12-01 09:35:38.623615848 +0000 UTC m=+1255.892264616" lastFinishedPulling="2025-12-01 09:35:42.745339207 +0000 UTC m=+1260.013987975" observedRunningTime="2025-12-01 09:35:43.445308596 +0000 UTC m=+1260.713957364" watchObservedRunningTime="2025-12-01 09:35:43.452952942 +0000 UTC m=+1260.721601710" Dec 01 09:35:43 crc kubenswrapper[4763]: I1201 09:35:43.483243 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" podStartSLOduration=6.483219785 podStartE2EDuration="6.483219785s" podCreationTimestamp="2025-12-01 09:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:35:43.477879695 +0000 UTC m=+1260.746528553" watchObservedRunningTime="2025-12-01 09:35:43.483219785 +0000 UTC m=+1260.751868553" Dec 01 09:35:44 crc kubenswrapper[4763]: I1201 09:35:44.397370 4763 generic.go:334] "Generic (PLEG): container finished" podID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerID="b0c1d0110fc1838000b9dec9259d57fdf2c821d3d77dde3a7ed5621cf6fd0bc6" exitCode=143 Dec 01 09:35:44 crc kubenswrapper[4763]: I1201 09:35:44.397554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47b6286d-535e-4fb8-81c0-b30b0d9b151a","Type":"ContainerDied","Data":"b0c1d0110fc1838000b9dec9259d57fdf2c821d3d77dde3a7ed5621cf6fd0bc6"} Dec 01 09:35:47 crc kubenswrapper[4763]: I1201 09:35:47.564443 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:35:47 crc kubenswrapper[4763]: I1201 09:35:47.564962 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:35:47 crc kubenswrapper[4763]: I1201 09:35:47.719578 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 09:35:47 crc kubenswrapper[4763]: I1201 09:35:47.719637 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 09:35:47 crc kubenswrapper[4763]: I1201 09:35:47.750116 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 09:35:47 crc kubenswrapper[4763]: I1201 09:35:47.972076 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:35:47 crc kubenswrapper[4763]: I1201 09:35:47.972399 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.028435 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.099709 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.228910 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-v6l7k"] Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.229386 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" podUID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" containerName="dnsmasq-dns" containerID="cri-o://3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b" gracePeriod=10 Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 
Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.433676 4763 generic.go:334] "Generic (PLEG): container finished" podID="d0b66291-ad35-4f2b-b21c-da8de9a419d2" containerID="fb2b85355691165fcc2f4bc4e6af80c4dc40b760c308d24019ebeb2755f66632" exitCode=0
Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.433763 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" event={"ID":"d0b66291-ad35-4f2b-b21c-da8de9a419d2","Type":"ContainerDied","Data":"fb2b85355691165fcc2f4bc4e6af80c4dc40b760c308d24019ebeb2755f66632"}
Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.434986 4763 generic.go:334] "Generic (PLEG): container finished" podID="a198f66c-6f38-4b84-b2bd-898f00d40932" containerID="0c6926e444bcc6c38baa09d6d0b3aaf27db832b3d984b682bbe88da39600f620" exitCode=0
Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.435099 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dfdsm" event={"ID":"a198f66c-6f38-4b84-b2bd-898f00d40932","Type":"ContainerDied","Data":"0c6926e444bcc6c38baa09d6d0b3aaf27db832b3d984b682bbe88da39600f620"}
Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.543435 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.657315 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.169:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:35:48 crc kubenswrapper[4763]: I1201 09:35:48.657535 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.169:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.417140 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k"
Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.443722 4763 generic.go:334] "Generic (PLEG): container finished" podID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" containerID="3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b" exitCode=0
Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.443809 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k"
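[Editor's note] Two kinds of terminal state are visible here. The prober.go:107 failures mean the nova-api-0 startup probe's HTTP GET to http://10.217.0.169:8774/ timed out ("context deadline exceeded" is a probe timeout, not a refused connection); the kubelet keeps retrying until the probe's failure threshold is exhausted or, as happens shortly after, the service warms up. Separately, the exit codes in the "container finished" entries follow the usual shell convention: 0 is normal completion (the run-to-completion db-sync and cell-mapping jobs), while 143 seen earlier is 128 + 15, i.e. termination by SIGTERM from the grace-period kill. A trivial decoder for those codes:

    # Decode container exit codes: values above 128 are 128 + signal number.
    import signal

    def explain(code: int) -> str:
        if code == 0:
            return "success"
        if code > 128:
            try:
                return f"killed by {signal.Signals(code - 128).name}"
            except ValueError:
                return f"killed by signal {code - 128}"
        return f"application error {code}"

    for code in (0, 143):
        print(code, "->", explain(code))   # 143 -> killed by SIGTERM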
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.443866 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" event={"ID":"6c4d1d3d-a555-47d6-ac42-29c65c3c0559","Type":"ContainerDied","Data":"3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b"} Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.443904 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-v6l7k" event={"ID":"6c4d1d3d-a555-47d6-ac42-29c65c3c0559","Type":"ContainerDied","Data":"819960ded2e5ff3fd65dc320355dc6140e1167eed86a19a76120e7606e3cdf22"} Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.443927 4763 scope.go:117] "RemoveContainer" containerID="3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.446604 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-dns-svc\") pod \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.446653 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-sb\") pod \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.446761 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-nb\") pod \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.446803 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljl4x\" (UniqueName: \"kubernetes.io/projected/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-kube-api-access-ljl4x\") pod \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.446847 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-config\") pod \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\" (UID: \"6c4d1d3d-a555-47d6-ac42-29c65c3c0559\") " Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.477190 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-kube-api-access-ljl4x" (OuterVolumeSpecName: "kube-api-access-ljl4x") pod "6c4d1d3d-a555-47d6-ac42-29c65c3c0559" (UID: "6c4d1d3d-a555-47d6-ac42-29c65c3c0559"). InnerVolumeSpecName "kube-api-access-ljl4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.500520 4763 scope.go:117] "RemoveContainer" containerID="a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.550144 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-config" (OuterVolumeSpecName: "config") pod "6c4d1d3d-a555-47d6-ac42-29c65c3c0559" (UID: "6c4d1d3d-a555-47d6-ac42-29c65c3c0559"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.557763 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.557859 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljl4x\" (UniqueName: \"kubernetes.io/projected/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-kube-api-access-ljl4x\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.560013 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c4d1d3d-a555-47d6-ac42-29c65c3c0559" (UID: "6c4d1d3d-a555-47d6-ac42-29c65c3c0559"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.573131 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c4d1d3d-a555-47d6-ac42-29c65c3c0559" (UID: "6c4d1d3d-a555-47d6-ac42-29c65c3c0559"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.579882 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c4d1d3d-a555-47d6-ac42-29c65c3c0559" (UID: "6c4d1d3d-a555-47d6-ac42-29c65c3c0559"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.659084 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.659130 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.659140 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c4d1d3d-a555-47d6-ac42-29c65c3c0559-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.678953 4763 scope.go:117] "RemoveContainer" containerID="3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b" Dec 01 09:35:49 crc kubenswrapper[4763]: E1201 09:35:49.682386 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b\": container with ID starting with 3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b not found: ID does not exist" containerID="3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.682418 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b"} err="failed to get container status \"3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b\": rpc error: code = NotFound desc = could not find container \"3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b\": container with ID starting with 3587cd943c13eefb94fde633bb7dfad5bae6a60fcf7e76052e9d53ff51ea0f2b not found: ID does not exist" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.682438 4763 scope.go:117] "RemoveContainer" containerID="a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934" Dec 01 09:35:49 crc kubenswrapper[4763]: E1201 09:35:49.687666 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934\": container with ID starting with a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934 not found: ID does not exist" containerID="a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.687745 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934"} err="failed to get container status \"a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934\": rpc error: code = NotFound desc = could not find container \"a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934\": container with ID starting with a825cce54b0e9542fb62546b8c6fab9d1e81c6db83bec4d90e8074bd3432f934 not found: ID does not exist" Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.797529 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-v6l7k"] Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.823129 4763 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-v6l7k"] Dec 01 09:35:49 crc kubenswrapper[4763]: I1201 09:35:49.944782 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.044914 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.069424 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqncn\" (UniqueName: \"kubernetes.io/projected/d0b66291-ad35-4f2b-b21c-da8de9a419d2-kube-api-access-xqncn\") pod \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.069532 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-scripts\") pod \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.069665 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-config-data\") pod \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.069743 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-combined-ca-bundle\") pod \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\" (UID: \"d0b66291-ad35-4f2b-b21c-da8de9a419d2\") " Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.074520 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-scripts" (OuterVolumeSpecName: "scripts") pod "d0b66291-ad35-4f2b-b21c-da8de9a419d2" (UID: "d0b66291-ad35-4f2b-b21c-da8de9a419d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.078706 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b66291-ad35-4f2b-b21c-da8de9a419d2-kube-api-access-xqncn" (OuterVolumeSpecName: "kube-api-access-xqncn") pod "d0b66291-ad35-4f2b-b21c-da8de9a419d2" (UID: "d0b66291-ad35-4f2b-b21c-da8de9a419d2"). InnerVolumeSpecName "kube-api-access-xqncn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.099761 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-config-data" (OuterVolumeSpecName: "config-data") pod "d0b66291-ad35-4f2b-b21c-da8de9a419d2" (UID: "d0b66291-ad35-4f2b-b21c-da8de9a419d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.115272 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0b66291-ad35-4f2b-b21c-da8de9a419d2" (UID: "d0b66291-ad35-4f2b-b21c-da8de9a419d2"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.173046 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-combined-ca-bundle\") pod \"a198f66c-6f38-4b84-b2bd-898f00d40932\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.173119 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-scripts\") pod \"a198f66c-6f38-4b84-b2bd-898f00d40932\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.173184 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj49w\" (UniqueName: \"kubernetes.io/projected/a198f66c-6f38-4b84-b2bd-898f00d40932-kube-api-access-fj49w\") pod \"a198f66c-6f38-4b84-b2bd-898f00d40932\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.173413 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-config-data\") pod \"a198f66c-6f38-4b84-b2bd-898f00d40932\" (UID: \"a198f66c-6f38-4b84-b2bd-898f00d40932\") " Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.173954 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.173968 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.173977 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b66291-ad35-4f2b-b21c-da8de9a419d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.173987 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqncn\" (UniqueName: \"kubernetes.io/projected/d0b66291-ad35-4f2b-b21c-da8de9a419d2-kube-api-access-xqncn\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.175957 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-scripts" (OuterVolumeSpecName: "scripts") pod "a198f66c-6f38-4b84-b2bd-898f00d40932" (UID: "a198f66c-6f38-4b84-b2bd-898f00d40932"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.176174 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a198f66c-6f38-4b84-b2bd-898f00d40932-kube-api-access-fj49w" (OuterVolumeSpecName: "kube-api-access-fj49w") pod "a198f66c-6f38-4b84-b2bd-898f00d40932" (UID: "a198f66c-6f38-4b84-b2bd-898f00d40932"). InnerVolumeSpecName "kube-api-access-fj49w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.201678 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a198f66c-6f38-4b84-b2bd-898f00d40932" (UID: "a198f66c-6f38-4b84-b2bd-898f00d40932"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.204876 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-config-data" (OuterVolumeSpecName: "config-data") pod "a198f66c-6f38-4b84-b2bd-898f00d40932" (UID: "a198f66c-6f38-4b84-b2bd-898f00d40932"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.275329 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.275359 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.275368 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a198f66c-6f38-4b84-b2bd-898f00d40932-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.275376 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj49w\" (UniqueName: \"kubernetes.io/projected/a198f66c-6f38-4b84-b2bd-898f00d40932-kube-api-access-fj49w\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.454982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" event={"ID":"d0b66291-ad35-4f2b-b21c-da8de9a419d2","Type":"ContainerDied","Data":"fdb312f3ec0dc814e15c1f7a9fc60249716c1d432c059f29ecb46e7a4113ee66"} Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.455024 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdb312f3ec0dc814e15c1f7a9fc60249716c1d432c059f29ecb46e7a4113ee66" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.455045 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-g9pwc" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.456609 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dfdsm" event={"ID":"a198f66c-6f38-4b84-b2bd-898f00d40932","Type":"ContainerDied","Data":"637753b9fc27a8bc861ee5baac82685900492b38afc8b62bd370b824428bcfad"} Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.456712 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="637753b9fc27a8bc861ee5baac82685900492b38afc8b62bd370b824428bcfad" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.456673 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dfdsm" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.587524 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 09:35:50 crc kubenswrapper[4763]: E1201 09:35:50.587867 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b66291-ad35-4f2b-b21c-da8de9a419d2" containerName="nova-cell1-conductor-db-sync" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.587882 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b66291-ad35-4f2b-b21c-da8de9a419d2" containerName="nova-cell1-conductor-db-sync" Dec 01 09:35:50 crc kubenswrapper[4763]: E1201 09:35:50.587901 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" containerName="init" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.587908 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" containerName="init" Dec 01 09:35:50 crc kubenswrapper[4763]: E1201 09:35:50.587922 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a198f66c-6f38-4b84-b2bd-898f00d40932" containerName="nova-manage" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.587928 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a198f66c-6f38-4b84-b2bd-898f00d40932" containerName="nova-manage" Dec 01 09:35:50 crc kubenswrapper[4763]: E1201 09:35:50.587941 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" containerName="dnsmasq-dns" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.587946 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" containerName="dnsmasq-dns" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.588096 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b66291-ad35-4f2b-b21c-da8de9a419d2" containerName="nova-cell1-conductor-db-sync" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.588110 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" containerName="dnsmasq-dns" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.588125 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a198f66c-6f38-4b84-b2bd-898f00d40932" containerName="nova-manage" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.588664 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.590445 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.605979 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.681331 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004fbb5-ee5c-4328-bb1f-9054e4224138-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.681544 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004fbb5-ee5c-4328-bb1f-9054e4224138-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.681739 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btvlm\" (UniqueName: \"kubernetes.io/projected/1004fbb5-ee5c-4328-bb1f-9054e4224138-kube-api-access-btvlm\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.714768 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.715009 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-log" containerID="cri-o://7c9be0e2cb682434fa9415bb667b6821db25e9672bf1ebd2017765786491c0bc" gracePeriod=30 Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.715136 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-api" containerID="cri-o://5180a120175bac76203dda520231caf165e3fde010473e7c342a2f780d6ab30c" gracePeriod=30 Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.726989 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.727231 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fa6e2a2f-e100-4666-a248-33fe0feae804" containerName="nova-scheduler-scheduler" containerID="cri-o://4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe" gracePeriod=30 Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.783761 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btvlm\" (UniqueName: \"kubernetes.io/projected/1004fbb5-ee5c-4328-bb1f-9054e4224138-kube-api-access-btvlm\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.783884 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1004fbb5-ee5c-4328-bb1f-9054e4224138-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.783960 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004fbb5-ee5c-4328-bb1f-9054e4224138-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.787297 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1004fbb5-ee5c-4328-bb1f-9054e4224138-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.801218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1004fbb5-ee5c-4328-bb1f-9054e4224138-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.801559 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btvlm\" (UniqueName: \"kubernetes.io/projected/1004fbb5-ee5c-4328-bb1f-9054e4224138-kube-api-access-btvlm\") pod \"nova-cell1-conductor-0\" (UID: \"1004fbb5-ee5c-4328-bb1f-9054e4224138\") " pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:50 crc kubenswrapper[4763]: I1201 09:35:50.908181 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:51 crc kubenswrapper[4763]: I1201 09:35:51.021399 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4d1d3d-a555-47d6-ac42-29c65c3c0559" path="/var/lib/kubelet/pods/6c4d1d3d-a555-47d6-ac42-29c65c3c0559/volumes" Dec 01 09:35:51 crc kubenswrapper[4763]: I1201 09:35:51.467378 4763 generic.go:334] "Generic (PLEG): container finished" podID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerID="7c9be0e2cb682434fa9415bb667b6821db25e9672bf1ebd2017765786491c0bc" exitCode=143 Dec 01 09:35:51 crc kubenswrapper[4763]: I1201 09:35:51.467468 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a777a5f1-1e8f-451c-b320-a35976b3f6ba","Type":"ContainerDied","Data":"7c9be0e2cb682434fa9415bb667b6821db25e9672bf1ebd2017765786491c0bc"} Dec 01 09:35:51 crc kubenswrapper[4763]: W1201 09:35:51.565222 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1004fbb5_ee5c_4328_bb1f_9054e4224138.slice/crio-267e45342b3159e1dbee3a803f457cd2b10719263900f082b2e63f9ed464bb21 WatchSource:0}: Error finding container 267e45342b3159e1dbee3a803f457cd2b10719263900f082b2e63f9ed464bb21: Status 404 returned error can't find the container with id 267e45342b3159e1dbee3a803f457cd2b10719263900f082b2e63f9ed464bb21 Dec 01 09:35:51 crc kubenswrapper[4763]: I1201 09:35:51.572318 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 01 09:35:52 crc kubenswrapper[4763]: I1201 09:35:52.480948 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1004fbb5-ee5c-4328-bb1f-9054e4224138","Type":"ContainerStarted","Data":"72c3178b02e14950d67671640333a8218134fa6248395a2045f2abbceb498011"} Dec 01 09:35:52 crc kubenswrapper[4763]: I1201 09:35:52.481277 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 01 09:35:52 crc kubenswrapper[4763]: I1201 09:35:52.481289 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1004fbb5-ee5c-4328-bb1f-9054e4224138","Type":"ContainerStarted","Data":"267e45342b3159e1dbee3a803f457cd2b10719263900f082b2e63f9ed464bb21"} Dec 01 09:35:52 crc kubenswrapper[4763]: E1201 09:35:52.725564 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:35:52 crc kubenswrapper[4763]: E1201 09:35:52.728979 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:35:52 crc kubenswrapper[4763]: E1201 09:35:52.730465 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:35:52 crc 
kubenswrapper[4763]: E1201 09:35:52.730500 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fa6e2a2f-e100-4666-a248-33fe0feae804" containerName="nova-scheduler-scheduler" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.312209 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.331392 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=5.331374927 podStartE2EDuration="5.331374927s" podCreationTimestamp="2025-12-01 09:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:35:52.501260695 +0000 UTC m=+1269.769909453" watchObservedRunningTime="2025-12-01 09:35:55.331374927 +0000 UTC m=+1272.600023695" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.379464 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-config-data\") pod \"fa6e2a2f-e100-4666-a248-33fe0feae804\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.379891 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-combined-ca-bundle\") pod \"fa6e2a2f-e100-4666-a248-33fe0feae804\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.379956 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc6zn\" (UniqueName: \"kubernetes.io/projected/fa6e2a2f-e100-4666-a248-33fe0feae804-kube-api-access-jc6zn\") pod \"fa6e2a2f-e100-4666-a248-33fe0feae804\" (UID: \"fa6e2a2f-e100-4666-a248-33fe0feae804\") " Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.385124 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6e2a2f-e100-4666-a248-33fe0feae804-kube-api-access-jc6zn" (OuterVolumeSpecName: "kube-api-access-jc6zn") pod "fa6e2a2f-e100-4666-a248-33fe0feae804" (UID: "fa6e2a2f-e100-4666-a248-33fe0feae804"). InnerVolumeSpecName "kube-api-access-jc6zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.415515 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-config-data" (OuterVolumeSpecName: "config-data") pod "fa6e2a2f-e100-4666-a248-33fe0feae804" (UID: "fa6e2a2f-e100-4666-a248-33fe0feae804"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.426565 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa6e2a2f-e100-4666-a248-33fe0feae804" (UID: "fa6e2a2f-e100-4666-a248-33fe0feae804"). InnerVolumeSpecName "combined-ca-bundle". 
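Note: two separate shutdown artifacts are interleaved above. First, nova-api-log exits with code 143 = 128 + 15, i.e. it was terminated by the SIGTERM delivered by the gracePeriod=30 kill at 09:35:50 (a few entries below, nova-api-api and the scheduler instead exit 0, having shut down cleanly on their own). Second, the nova-scheduler readiness probe, an exec of /usr/bin/pgrep -r DRST nova-scheduler, errors with "cannot register an exec PID: container is stopping": the runtime refuses to start new exec sessions in a container that is being stopped, so these probe errors during a deliberate delete read as expected noise rather than a health regression. A small decoder for the 128+N convention used in these exitCode fields:

    import signal

    def describe_exit(code: int) -> str:
        # Container runtimes report "killed by signal N" as exit code 128 + N.
        if code > 128:
            try:
                return f"killed by {signal.Signals(code - 128).name}"
            except ValueError:
                return f"killed by signal {code - 128}"
        return "exited normally" if code == 0 else f"exited with status {code}"

    for code in (143, 0, 2):   # values observed in this journal window
        print(code, "->", describe_exit(code))
    # 143 -> killed by SIGTERM (the grace-period kill above)
    # 0   -> exited normally
    # 2   -> exited with status 2 (the process's own exit status, not a signal)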
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.477102 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.483035 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.483062 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc6zn\" (UniqueName: \"kubernetes.io/projected/fa6e2a2f-e100-4666-a248-33fe0feae804-kube-api-access-jc6zn\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.483074 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6e2a2f-e100-4666-a248-33fe0feae804-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.538593 4763 generic.go:334] "Generic (PLEG): container finished" podID="fa6e2a2f-e100-4666-a248-33fe0feae804" containerID="4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe" exitCode=0 Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.538713 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.539108 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa6e2a2f-e100-4666-a248-33fe0feae804","Type":"ContainerDied","Data":"4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe"} Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.539187 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa6e2a2f-e100-4666-a248-33fe0feae804","Type":"ContainerDied","Data":"8a465fb93bc0b25c38ba880a4dee2495f7d476ef7944f346e3c8b4c6ffbe0eb4"} Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.539235 4763 scope.go:117] "RemoveContainer" containerID="4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.545607 4763 generic.go:334] "Generic (PLEG): container finished" podID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerID="5180a120175bac76203dda520231caf165e3fde010473e7c342a2f780d6ab30c" exitCode=0 Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.545640 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a777a5f1-1e8f-451c-b320-a35976b3f6ba","Type":"ContainerDied","Data":"5180a120175bac76203dda520231caf165e3fde010473e7c342a2f780d6ab30c"} Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.608371 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.617064 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.636717 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:35:55 crc kubenswrapper[4763]: E1201 09:35:55.637256 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6e2a2f-e100-4666-a248-33fe0feae804" containerName="nova-scheduler-scheduler" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.637279 4763 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fa6e2a2f-e100-4666-a248-33fe0feae804" containerName="nova-scheduler-scheduler" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.637527 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6e2a2f-e100-4666-a248-33fe0feae804" containerName="nova-scheduler-scheduler" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.638259 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.643671 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.657062 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.679172 4763 scope.go:117] "RemoveContainer" containerID="4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe" Dec 01 09:35:55 crc kubenswrapper[4763]: E1201 09:35:55.679828 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe\": container with ID starting with 4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe not found: ID does not exist" containerID="4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.679857 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe"} err="failed to get container status \"4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe\": rpc error: code = NotFound desc = could not find container \"4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe\": container with ID starting with 4451aa0f9ea8bebc214078f90e2a9c6efb51b7205ef99f243dc2444060ad81fe not found: ID does not exist" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.687973 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj2gn\" (UniqueName: \"kubernetes.io/projected/fb4ca475-898b-4bb6-a249-8bc297417775-kube-api-access-kj2gn\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.688104 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-config-data\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.688266 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.692207 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.789448 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a777a5f1-1e8f-451c-b320-a35976b3f6ba-logs\") pod \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.789634 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqgpd\" (UniqueName: \"kubernetes.io/projected/a777a5f1-1e8f-451c-b320-a35976b3f6ba-kube-api-access-zqgpd\") pod \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.789673 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-combined-ca-bundle\") pod \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.789735 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-config-data\") pod \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\" (UID: \"a777a5f1-1e8f-451c-b320-a35976b3f6ba\") " Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.790076 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj2gn\" (UniqueName: \"kubernetes.io/projected/fb4ca475-898b-4bb6-a249-8bc297417775-kube-api-access-kj2gn\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.790147 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-config-data\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.790240 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.791941 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a777a5f1-1e8f-451c-b320-a35976b3f6ba-logs" (OuterVolumeSpecName: "logs") pod "a777a5f1-1e8f-451c-b320-a35976b3f6ba" (UID: "a777a5f1-1e8f-451c-b320-a35976b3f6ba"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.795892 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.798403 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-config-data\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.802364 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a777a5f1-1e8f-451c-b320-a35976b3f6ba-kube-api-access-zqgpd" (OuterVolumeSpecName: "kube-api-access-zqgpd") pod "a777a5f1-1e8f-451c-b320-a35976b3f6ba" (UID: "a777a5f1-1e8f-451c-b320-a35976b3f6ba"). InnerVolumeSpecName "kube-api-access-zqgpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.827121 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj2gn\" (UniqueName: \"kubernetes.io/projected/fb4ca475-898b-4bb6-a249-8bc297417775-kube-api-access-kj2gn\") pod \"nova-scheduler-0\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " pod="openstack/nova-scheduler-0" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.847669 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a777a5f1-1e8f-451c-b320-a35976b3f6ba" (UID: "a777a5f1-1e8f-451c-b320-a35976b3f6ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.849642 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-config-data" (OuterVolumeSpecName: "config-data") pod "a777a5f1-1e8f-451c-b320-a35976b3f6ba" (UID: "a777a5f1-1e8f-451c-b320-a35976b3f6ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.891425 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a777a5f1-1e8f-451c-b320-a35976b3f6ba-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.891467 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqgpd\" (UniqueName: \"kubernetes.io/projected/a777a5f1-1e8f-451c-b320-a35976b3f6ba-kube-api-access-zqgpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.891478 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.891486 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a777a5f1-1e8f-451c-b320-a35976b3f6ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:35:55 crc kubenswrapper[4763]: I1201 09:35:55.954007 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.511321 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.560244 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a777a5f1-1e8f-451c-b320-a35976b3f6ba","Type":"ContainerDied","Data":"ea0e7b32807d850b4e41f8487873b736f8938fc9584f4aee566be5e884f4e3ba"} Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.560325 4763 scope.go:117] "RemoveContainer" containerID="5180a120175bac76203dda520231caf165e3fde010473e7c342a2f780d6ab30c" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.560580 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.562905 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb4ca475-898b-4bb6-a249-8bc297417775","Type":"ContainerStarted","Data":"8669eb82c395c4eed8064285231f37ab41de513b095853d170c2391fda101096"} Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.607676 4763 scope.go:117] "RemoveContainer" containerID="7c9be0e2cb682434fa9415bb667b6821db25e9672bf1ebd2017765786491c0bc" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.616989 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.629929 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.664813 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 09:35:56 crc kubenswrapper[4763]: E1201 09:35:56.665253 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-api" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.665267 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-api" Dec 01 09:35:56 crc kubenswrapper[4763]: E1201 09:35:56.665280 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-log" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.665285 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-log" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.665481 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-api" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.665494 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" containerName="nova-api-log" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.666435 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.670972 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.683884 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.813231 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.813347 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-config-data\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.813369 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85rm\" (UniqueName: \"kubernetes.io/projected/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-kube-api-access-d85rm\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.813423 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-logs\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.915483 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.915659 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-config-data\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.915688 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d85rm\" (UniqueName: \"kubernetes.io/projected/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-kube-api-access-d85rm\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.916310 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-logs\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.916778 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-logs\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " 
pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.919693 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.927149 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-config-data\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.942638 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85rm\" (UniqueName: \"kubernetes.io/projected/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-kube-api-access-d85rm\") pod \"nova-api-0\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") " pod="openstack/nova-api-0" Dec 01 09:35:56 crc kubenswrapper[4763]: I1201 09:35:56.983537 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:35:57 crc kubenswrapper[4763]: I1201 09:35:57.011448 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a777a5f1-1e8f-451c-b320-a35976b3f6ba" path="/var/lib/kubelet/pods/a777a5f1-1e8f-451c-b320-a35976b3f6ba/volumes" Dec 01 09:35:57 crc kubenswrapper[4763]: I1201 09:35:57.012055 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6e2a2f-e100-4666-a248-33fe0feae804" path="/var/lib/kubelet/pods/fa6e2a2f-e100-4666-a248-33fe0feae804/volumes" Dec 01 09:35:57 crc kubenswrapper[4763]: I1201 09:35:57.548860 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:35:57 crc kubenswrapper[4763]: W1201 09:35:57.557071 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3687bcb9_8c1d_4f52_bc92_237681b9c5ed.slice/crio-6800ce83cc77aabe59c418bd6709bdd3c4c0a70d7573c2d8cc288071115a726a WatchSource:0}: Error finding container 6800ce83cc77aabe59c418bd6709bdd3c4c0a70d7573c2d8cc288071115a726a: Status 404 returned error can't find the container with id 6800ce83cc77aabe59c418bd6709bdd3c4c0a70d7573c2d8cc288071115a726a Dec 01 09:35:57 crc kubenswrapper[4763]: I1201 09:35:57.585143 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb4ca475-898b-4bb6-a249-8bc297417775","Type":"ContainerStarted","Data":"8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c"} Dec 01 09:35:57 crc kubenswrapper[4763]: I1201 09:35:57.586939 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3687bcb9-8c1d-4f52-bc92-237681b9c5ed","Type":"ContainerStarted","Data":"6800ce83cc77aabe59c418bd6709bdd3c4c0a70d7573c2d8cc288071115a726a"} Dec 01 09:35:58 crc kubenswrapper[4763]: I1201 09:35:58.598927 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3687bcb9-8c1d-4f52-bc92-237681b9c5ed","Type":"ContainerStarted","Data":"b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81"} Dec 01 09:35:58 crc kubenswrapper[4763]: I1201 09:35:58.599271 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3687bcb9-8c1d-4f52-bc92-237681b9c5ed","Type":"ContainerStarted","Data":"a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853"} Dec 01 09:35:58 crc kubenswrapper[4763]: I1201 09:35:58.621568 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.621546847 podStartE2EDuration="3.621546847s" podCreationTimestamp="2025-12-01 09:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:35:57.606762943 +0000 UTC m=+1274.875411711" watchObservedRunningTime="2025-12-01 09:35:58.621546847 +0000 UTC m=+1275.890195625" Dec 01 09:35:58 crc kubenswrapper[4763]: I1201 09:35:58.624903 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.624889431 podStartE2EDuration="2.624889431s" podCreationTimestamp="2025-12-01 09:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:35:58.616810733 +0000 UTC m=+1275.885459521" watchObservedRunningTime="2025-12-01 09:35:58.624889431 +0000 UTC m=+1275.893538199" Dec 01 09:35:58 crc kubenswrapper[4763]: I1201 09:35:58.761012 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:35:58 crc kubenswrapper[4763]: I1201 09:35:58.761256 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f606a872-f73e-4245-bf9f-75b4a90dc12f" containerName="kube-state-metrics" containerID="cri-o://0b063f6419fc4006e9f47667888a26e42b268010b8e140dbe89949348e770766" gracePeriod=30 Dec 01 09:35:58 crc kubenswrapper[4763]: I1201 09:35:58.971206 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="f606a872-f73e-4245-bf9f-75b4a90dc12f" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": dial tcp 10.217.0.104:8081: connect: connection refused" Dec 01 09:35:59 crc kubenswrapper[4763]: I1201 09:35:59.615397 4763 generic.go:334] "Generic (PLEG): container finished" podID="f606a872-f73e-4245-bf9f-75b4a90dc12f" containerID="0b063f6419fc4006e9f47667888a26e42b268010b8e140dbe89949348e770766" exitCode=2 Dec 01 09:35:59 crc kubenswrapper[4763]: I1201 09:35:59.615529 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f606a872-f73e-4245-bf9f-75b4a90dc12f","Type":"ContainerDied","Data":"0b063f6419fc4006e9f47667888a26e42b268010b8e140dbe89949348e770766"} Dec 01 09:35:59 crc kubenswrapper[4763]: I1201 09:35:59.748520 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:35:59 crc kubenswrapper[4763]: I1201 09:35:59.781886 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfsvs\" (UniqueName: \"kubernetes.io/projected/f606a872-f73e-4245-bf9f-75b4a90dc12f-kube-api-access-mfsvs\") pod \"f606a872-f73e-4245-bf9f-75b4a90dc12f\" (UID: \"f606a872-f73e-4245-bf9f-75b4a90dc12f\") " Dec 01 09:35:59 crc kubenswrapper[4763]: I1201 09:35:59.837972 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f606a872-f73e-4245-bf9f-75b4a90dc12f-kube-api-access-mfsvs" (OuterVolumeSpecName: "kube-api-access-mfsvs") pod "f606a872-f73e-4245-bf9f-75b4a90dc12f" (UID: "f606a872-f73e-4245-bf9f-75b4a90dc12f"). InnerVolumeSpecName "kube-api-access-mfsvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:35:59 crc kubenswrapper[4763]: I1201 09:35:59.888198 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfsvs\" (UniqueName: \"kubernetes.io/projected/f606a872-f73e-4245-bf9f-75b4a90dc12f-kube-api-access-mfsvs\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.360903 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.361300 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="ceilometer-central-agent" containerID="cri-o://54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62" gracePeriod=30 Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.361334 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="sg-core" containerID="cri-o://cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b" gracePeriod=30 Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.361334 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="ceilometer-notification-agent" containerID="cri-o://94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa" gracePeriod=30 Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.361912 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="proxy-httpd" containerID="cri-o://1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb" gracePeriod=30 Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.632662 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f606a872-f73e-4245-bf9f-75b4a90dc12f","Type":"ContainerDied","Data":"c3901279aac015a6e0ba1e0587b0925c4d9be796a08a97eeaad92a010cc476f3"} Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.632733 4763 scope.go:117] "RemoveContainer" containerID="0b063f6419fc4006e9f47667888a26e42b268010b8e140dbe89949348e770766" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.633668 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.650760 4763 generic.go:334] "Generic (PLEG): container finished" podID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerID="1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb" exitCode=0 Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.650799 4763 generic.go:334] "Generic (PLEG): container finished" podID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerID="cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b" exitCode=2 Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.650823 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerDied","Data":"1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb"} Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.650851 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerDied","Data":"cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b"} Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.690540 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.704159 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.719404 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:36:00 crc kubenswrapper[4763]: E1201 09:36:00.719889 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f606a872-f73e-4245-bf9f-75b4a90dc12f" containerName="kube-state-metrics" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.719915 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f606a872-f73e-4245-bf9f-75b4a90dc12f" containerName="kube-state-metrics" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.720141 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f606a872-f73e-4245-bf9f-75b4a90dc12f" containerName="kube-state-metrics" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.720925 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.723039 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.727762 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.730138 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.811748 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k85j\" (UniqueName: \"kubernetes.io/projected/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-api-access-9k85j\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.811796 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.811908 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.811984 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.913209 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.913583 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k85j\" (UniqueName: \"kubernetes.io/projected/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-api-access-9k85j\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.913611 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.913733 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" 
(UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.916979 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.917400 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.919680 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22ccd9e5-e651-4111-adfa-853c6d838d96-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.932029 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k85j\" (UniqueName: \"kubernetes.io/projected/22ccd9e5-e651-4111-adfa-853c6d838d96-kube-api-access-9k85j\") pod \"kube-state-metrics-0\" (UID: \"22ccd9e5-e651-4111-adfa-853c6d838d96\") " pod="openstack/kube-state-metrics-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.954200 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 09:36:00 crc kubenswrapper[4763]: I1201 09:36:00.991490 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 01 09:36:01 crc kubenswrapper[4763]: I1201 09:36:01.006289 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f606a872-f73e-4245-bf9f-75b4a90dc12f" path="/var/lib/kubelet/pods/f606a872-f73e-4245-bf9f-75b4a90dc12f/volumes" Dec 01 09:36:01 crc kubenswrapper[4763]: I1201 09:36:01.039155 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 01 09:36:01 crc kubenswrapper[4763]: I1201 09:36:01.536825 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 01 09:36:01 crc kubenswrapper[4763]: W1201 09:36:01.538151 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22ccd9e5_e651_4111_adfa_853c6d838d96.slice/crio-bf8f2257459404a7943d011abc5dfea640bd66d366bd2428269c8e2fc3f926ea WatchSource:0}: Error finding container bf8f2257459404a7943d011abc5dfea640bd66d366bd2428269c8e2fc3f926ea: Status 404 returned error can't find the container with id bf8f2257459404a7943d011abc5dfea640bd66d366bd2428269c8e2fc3f926ea Dec 01 09:36:01 crc kubenswrapper[4763]: I1201 09:36:01.662615 4763 generic.go:334] "Generic (PLEG): container finished" podID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerID="54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62" exitCode=0 Dec 01 09:36:01 crc kubenswrapper[4763]: I1201 09:36:01.662696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerDied","Data":"54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62"} Dec 01 09:36:01 crc kubenswrapper[4763]: I1201 09:36:01.666280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"22ccd9e5-e651-4111-adfa-853c6d838d96","Type":"ContainerStarted","Data":"bf8f2257459404a7943d011abc5dfea640bd66d366bd2428269c8e2fc3f926ea"} Dec 01 09:36:02 crc kubenswrapper[4763]: I1201 09:36:02.679395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"22ccd9e5-e651-4111-adfa-853c6d838d96","Type":"ContainerStarted","Data":"55ac9bbeccdcd847dfa31fb007e31659d77748125773943bde1997eb4003d3be"} Dec 01 09:36:02 crc kubenswrapper[4763]: I1201 09:36:02.679977 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 01 09:36:02 crc kubenswrapper[4763]: I1201 09:36:02.696091 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.213060579 podStartE2EDuration="2.696074314s" podCreationTimestamp="2025-12-01 09:36:00 +0000 UTC" firstStartedPulling="2025-12-01 09:36:01.541317665 +0000 UTC m=+1278.809966433" lastFinishedPulling="2025-12-01 09:36:02.0243314 +0000 UTC m=+1279.292980168" observedRunningTime="2025-12-01 09:36:02.694291274 +0000 UTC m=+1279.962940052" watchObservedRunningTime="2025-12-01 09:36:02.696074314 +0000 UTC m=+1279.964723082" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.293663 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.420875 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-config-data\") pod \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.420933 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-log-httpd\") pod \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.421005 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-sg-core-conf-yaml\") pod \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.421064 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnkz2\" (UniqueName: \"kubernetes.io/projected/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-kube-api-access-wnkz2\") pod \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.421092 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-run-httpd\") pod \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.421122 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-scripts\") pod \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.421270 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-combined-ca-bundle\") pod \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\" (UID: \"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5\") " Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.421795 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" (UID: "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.422098 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.425151 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" (UID: "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.426832 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-kube-api-access-wnkz2" (OuterVolumeSpecName: "kube-api-access-wnkz2") pod "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" (UID: "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5"). InnerVolumeSpecName "kube-api-access-wnkz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.427263 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-scripts" (OuterVolumeSpecName: "scripts") pod "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" (UID: "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.452626 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" (UID: "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.496661 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" (UID: "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.523694 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnkz2\" (UniqueName: \"kubernetes.io/projected/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-kube-api-access-wnkz2\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.523731 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.523741 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.523750 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.523760 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.541732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-config-data" (OuterVolumeSpecName: "config-data") pod "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" (UID: "e89a1e87-32e4-48e8-8a9e-02b7e50f81d5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.625608 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.707069 4763 generic.go:334] "Generic (PLEG): container finished" podID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerID="94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa" exitCode=0 Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.707159 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.707180 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerDied","Data":"94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa"} Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.707493 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e89a1e87-32e4-48e8-8a9e-02b7e50f81d5","Type":"ContainerDied","Data":"cff2d5fa34f7114cab40d5718059b81e6e03cb2f8bce67efdbd9ec4c01d321c1"} Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.707515 4763 scope.go:117] "RemoveContainer" containerID="1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.739024 4763 scope.go:117] "RemoveContainer" containerID="cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.763686 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.787138 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.787694 4763 scope.go:117] "RemoveContainer" containerID="94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.801413 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:05 crc kubenswrapper[4763]: E1201 09:36:05.801824 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="sg-core" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.801841 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="sg-core" Dec 01 09:36:05 crc kubenswrapper[4763]: E1201 09:36:05.801864 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="proxy-httpd" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.801870 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="proxy-httpd" Dec 01 09:36:05 crc kubenswrapper[4763]: E1201 09:36:05.801885 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="ceilometer-central-agent" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.801890 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="ceilometer-central-agent" Dec 01 09:36:05 crc 
kubenswrapper[4763]: E1201 09:36:05.801901 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="ceilometer-notification-agent" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.801907 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="ceilometer-notification-agent" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.802116 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="ceilometer-central-agent" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.802133 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="proxy-httpd" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.802142 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="ceilometer-notification-agent" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.802160 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" containerName="sg-core" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.803735 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.811626 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.841084 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.841495 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.841888 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.851374 4763 scope.go:117] "RemoveContainer" containerID="54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.891479 4763 scope.go:117] "RemoveContainer" containerID="1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb" Dec 01 09:36:05 crc kubenswrapper[4763]: E1201 09:36:05.891836 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb\": container with ID starting with 1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb not found: ID does not exist" containerID="1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.891881 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb"} err="failed to get container status \"1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb\": rpc error: code = NotFound desc = could not find container \"1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb\": container with ID starting with 1ab50466d36378e4f072049bd730aefd88ed822737388c38cf31c023fcceeebb not found: ID does not exist" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.891904 4763 scope.go:117] 
"RemoveContainer" containerID="cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b" Dec 01 09:36:05 crc kubenswrapper[4763]: E1201 09:36:05.892310 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b\": container with ID starting with cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b not found: ID does not exist" containerID="cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.892355 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b"} err="failed to get container status \"cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b\": rpc error: code = NotFound desc = could not find container \"cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b\": container with ID starting with cb630543243ecc2060b9cd0c9096ff59251a2e49180c7c063e34ba5c499aeb1b not found: ID does not exist" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.892387 4763 scope.go:117] "RemoveContainer" containerID="94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa" Dec 01 09:36:05 crc kubenswrapper[4763]: E1201 09:36:05.892886 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa\": container with ID starting with 94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa not found: ID does not exist" containerID="94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.892911 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa"} err="failed to get container status \"94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa\": rpc error: code = NotFound desc = could not find container \"94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa\": container with ID starting with 94d54a1b549a9ae9411680ed331a8f385d66bf2d99b59b383ab37cdedaa00caa not found: ID does not exist" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.892931 4763 scope.go:117] "RemoveContainer" containerID="54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62" Dec 01 09:36:05 crc kubenswrapper[4763]: E1201 09:36:05.893381 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62\": container with ID starting with 54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62 not found: ID does not exist" containerID="54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.893413 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62"} err="failed to get container status \"54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62\": rpc error: code = NotFound desc = could not find container \"54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62\": container with ID starting with 
54897bc0daf49d719f8ce05851539745170ef0efe3ba9732cb910afba7b72a62 not found: ID does not exist" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.941483 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.941527 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-run-httpd\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.941645 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-config-data\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.941728 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-scripts\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.941800 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.941865 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-log-httpd\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.941914 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnpnv\" (UniqueName: \"kubernetes.io/projected/08ea8b4b-223e-495d-a145-27346ad0862f-kube-api-access-rnpnv\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.941974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.955147 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 09:36:05 crc kubenswrapper[4763]: I1201 09:36:05.983751 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.043348 4763 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-config-data\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.043402 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-scripts\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.043485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.043525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-log-httpd\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.043542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnpnv\" (UniqueName: \"kubernetes.io/projected/08ea8b4b-223e-495d-a145-27346ad0862f-kube-api-access-rnpnv\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.043565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.043639 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.043661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-run-httpd\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.045107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-run-httpd\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.045983 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-log-httpd\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.051805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-config-data\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.053674 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.054203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.055000 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.062132 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-scripts\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.072467 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnpnv\" (UniqueName: \"kubernetes.io/projected/08ea8b4b-223e-495d-a145-27346ad0862f-kube-api-access-rnpnv\") pod \"ceilometer-0\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.151937 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.603343 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:06 crc kubenswrapper[4763]: W1201 09:36:06.605899 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08ea8b4b_223e_495d_a145_27346ad0862f.slice/crio-74534a738636bcd6cd39b4b5e3a08956275cb168646bfc55dcde84ca4841a52e WatchSource:0}: Error finding container 74534a738636bcd6cd39b4b5e3a08956275cb168646bfc55dcde84ca4841a52e: Status 404 returned error can't find the container with id 74534a738636bcd6cd39b4b5e3a08956275cb168646bfc55dcde84ca4841a52e Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.718238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerStarted","Data":"74534a738636bcd6cd39b4b5e3a08956275cb168646bfc55dcde84ca4841a52e"} Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.751318 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.983874 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:36:06 crc kubenswrapper[4763]: I1201 09:36:06.983935 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:36:07 crc kubenswrapper[4763]: I1201 09:36:07.010276 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89a1e87-32e4-48e8-8a9e-02b7e50f81d5" path="/var/lib/kubelet/pods/e89a1e87-32e4-48e8-8a9e-02b7e50f81d5/volumes" Dec 01 09:36:07 crc kubenswrapper[4763]: I1201 09:36:07.728722 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerStarted","Data":"e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938"} Dec 01 09:36:08 crc kubenswrapper[4763]: I1201 09:36:08.066638 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:36:08 crc kubenswrapper[4763]: I1201 09:36:08.066690 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:36:08 crc kubenswrapper[4763]: I1201 09:36:08.738526 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerStarted","Data":"f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905"} Dec 01 09:36:09 crc kubenswrapper[4763]: I1201 09:36:09.747836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerStarted","Data":"856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72"} Dec 01 09:36:11 crc kubenswrapper[4763]: I1201 09:36:11.066030 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/kube-state-metrics-0" Dec 01 09:36:11 crc kubenswrapper[4763]: I1201 09:36:11.768316 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerStarted","Data":"1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf"} Dec 01 09:36:11 crc kubenswrapper[4763]: I1201 09:36:11.769118 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:36:11 crc kubenswrapper[4763]: I1201 09:36:11.792632 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.453709706 podStartE2EDuration="6.792612546s" podCreationTimestamp="2025-12-01 09:36:05 +0000 UTC" firstStartedPulling="2025-12-01 09:36:06.609707336 +0000 UTC m=+1283.878356104" lastFinishedPulling="2025-12-01 09:36:10.948610176 +0000 UTC m=+1288.217258944" observedRunningTime="2025-12-01 09:36:11.791598648 +0000 UTC m=+1289.060247416" watchObservedRunningTime="2025-12-01 09:36:11.792612546 +0000 UTC m=+1289.061261314" Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.792695 4763 generic.go:334] "Generic (PLEG): container finished" podID="649aeb71-482b-4683-a727-252060682032" containerID="76a45d3100b95b0c0e254da3df01d201dca0ba6fad244c10c1ae0b8b5ff57cb4" exitCode=137 Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.793088 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"649aeb71-482b-4683-a727-252060682032","Type":"ContainerDied","Data":"76a45d3100b95b0c0e254da3df01d201dca0ba6fad244c10c1ae0b8b5ff57cb4"} Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.794371 4763 generic.go:334] "Generic (PLEG): container finished" podID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerID="650648affd993e2501d41d603948c620e6aa62ffcae2148d65199a5089239139" exitCode=137 Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.794396 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47b6286d-535e-4fb8-81c0-b30b0d9b151a","Type":"ContainerDied","Data":"650648affd993e2501d41d603948c620e6aa62ffcae2148d65199a5089239139"} Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.966263 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.970695 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.987676 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kxf8\" (UniqueName: \"kubernetes.io/projected/649aeb71-482b-4683-a727-252060682032-kube-api-access-7kxf8\") pod \"649aeb71-482b-4683-a727-252060682032\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.987739 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chzr6\" (UniqueName: \"kubernetes.io/projected/47b6286d-535e-4fb8-81c0-b30b0d9b151a-kube-api-access-chzr6\") pod \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.987771 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b6286d-535e-4fb8-81c0-b30b0d9b151a-logs\") pod \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.987837 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-config-data\") pod \"649aeb71-482b-4683-a727-252060682032\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.987897 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-combined-ca-bundle\") pod \"649aeb71-482b-4683-a727-252060682032\" (UID: \"649aeb71-482b-4683-a727-252060682032\") " Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.987917 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-combined-ca-bundle\") pod \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.987986 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-config-data\") pod \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\" (UID: \"47b6286d-535e-4fb8-81c0-b30b0d9b151a\") " Dec 01 09:36:13 crc kubenswrapper[4763]: I1201 09:36:13.990252 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47b6286d-535e-4fb8-81c0-b30b0d9b151a-logs" (OuterVolumeSpecName: "logs") pod "47b6286d-535e-4fb8-81c0-b30b0d9b151a" (UID: "47b6286d-535e-4fb8-81c0-b30b0d9b151a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.015014 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649aeb71-482b-4683-a727-252060682032-kube-api-access-7kxf8" (OuterVolumeSpecName: "kube-api-access-7kxf8") pod "649aeb71-482b-4683-a727-252060682032" (UID: "649aeb71-482b-4683-a727-252060682032"). InnerVolumeSpecName "kube-api-access-7kxf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.019350 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "649aeb71-482b-4683-a727-252060682032" (UID: "649aeb71-482b-4683-a727-252060682032"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.026598 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47b6286d-535e-4fb8-81c0-b30b0d9b151a-kube-api-access-chzr6" (OuterVolumeSpecName: "kube-api-access-chzr6") pod "47b6286d-535e-4fb8-81c0-b30b0d9b151a" (UID: "47b6286d-535e-4fb8-81c0-b30b0d9b151a"). InnerVolumeSpecName "kube-api-access-chzr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.028011 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-config-data" (OuterVolumeSpecName: "config-data") pod "47b6286d-535e-4fb8-81c0-b30b0d9b151a" (UID: "47b6286d-535e-4fb8-81c0-b30b0d9b151a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.029943 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47b6286d-535e-4fb8-81c0-b30b0d9b151a" (UID: "47b6286d-535e-4fb8-81c0-b30b0d9b151a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.057082 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-config-data" (OuterVolumeSpecName: "config-data") pod "649aeb71-482b-4683-a727-252060682032" (UID: "649aeb71-482b-4683-a727-252060682032"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.093001 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kxf8\" (UniqueName: \"kubernetes.io/projected/649aeb71-482b-4683-a727-252060682032-kube-api-access-7kxf8\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.093220 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chzr6\" (UniqueName: \"kubernetes.io/projected/47b6286d-535e-4fb8-81c0-b30b0d9b151a-kube-api-access-chzr6\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.093307 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47b6286d-535e-4fb8-81c0-b30b0d9b151a-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.093378 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.093436 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649aeb71-482b-4683-a727-252060682032-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.093514 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.093581 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47b6286d-535e-4fb8-81c0-b30b0d9b151a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.805596 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"47b6286d-535e-4fb8-81c0-b30b0d9b151a","Type":"ContainerDied","Data":"47a4146792a91e2e5acaad40f0f0bf4fa148d95f0533ac45ce82bf996c7bcb10"} Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.805660 4763 scope.go:117] "RemoveContainer" containerID="650648affd993e2501d41d603948c620e6aa62ffcae2148d65199a5089239139" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.805670 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.823280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"649aeb71-482b-4683-a727-252060682032","Type":"ContainerDied","Data":"40f7dac49d75dd7b844ac7e8ad449dade62a0c1145aa1e12974bc44790056fe1"} Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.823335 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.849776 4763 scope.go:117] "RemoveContainer" containerID="b0c1d0110fc1838000b9dec9259d57fdf2c821d3d77dde3a7ed5621cf6fd0bc6" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.864654 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.879761 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.888168 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.909384 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.916853 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:14 crc kubenswrapper[4763]: E1201 09:36:14.917340 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerName="nova-metadata-log" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.917368 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerName="nova-metadata-log" Dec 01 09:36:14 crc kubenswrapper[4763]: E1201 09:36:14.917392 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerName="nova-metadata-metadata" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.917400 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerName="nova-metadata-metadata" Dec 01 09:36:14 crc kubenswrapper[4763]: E1201 09:36:14.917434 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649aeb71-482b-4683-a727-252060682032" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.917443 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="649aeb71-482b-4683-a727-252060682032" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.917683 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="649aeb71-482b-4683-a727-252060682032" containerName="nova-cell1-novncproxy-novncproxy" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.917709 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerName="nova-metadata-metadata" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.917730 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" containerName="nova-metadata-log" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.918892 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.924916 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.925286 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.929374 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.930849 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.937436 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.937721 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.937969 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.939676 4763 scope.go:117] "RemoveContainer" containerID="76a45d3100b95b0c0e254da3df01d201dca0ba6fad244c10c1ae0b8b5ff57cb4" Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.954658 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:14 crc kubenswrapper[4763]: I1201 09:36:14.966773 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007190 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007284 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007315 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qrkt\" (UniqueName: \"kubernetes.io/projected/80db5535-a70c-4d66-9946-18192483b360-kube-api-access-7qrkt\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007359 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-config-data\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007412 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007450 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007511 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007540 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007604 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80db5535-a70c-4d66-9946-18192483b360-logs\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.007681 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tmq\" (UniqueName: \"kubernetes.io/projected/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-kube-api-access-k5tmq\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.015514 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47b6286d-535e-4fb8-81c0-b30b0d9b151a" path="/var/lib/kubelet/pods/47b6286d-535e-4fb8-81c0-b30b0d9b151a/volumes" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.016227 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649aeb71-482b-4683-a727-252060682032" path="/var/lib/kubelet/pods/649aeb71-482b-4683-a727-252060682032/volumes" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109242 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tmq\" (UniqueName: \"kubernetes.io/projected/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-kube-api-access-k5tmq\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109308 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109360 4763 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109386 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qrkt\" (UniqueName: \"kubernetes.io/projected/80db5535-a70c-4d66-9946-18192483b360-kube-api-access-7qrkt\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109435 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-config-data\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109489 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109511 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109544 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109599 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.109642 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80db5535-a70c-4d66-9946-18192483b360-logs\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.110153 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80db5535-a70c-4d66-9946-18192483b360-logs\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.114502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.115037 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.115138 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.118044 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.125173 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-config-data\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.129053 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.129507 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.135125 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qrkt\" (UniqueName: \"kubernetes.io/projected/80db5535-a70c-4d66-9946-18192483b360-kube-api-access-7qrkt\") pod \"nova-metadata-0\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.135670 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5tmq\" (UniqueName: \"kubernetes.io/projected/34e5c16d-5a9f-43f0-a2ba-ca4a768891a7-kube-api-access-k5tmq\") pod \"nova-cell1-novncproxy-0\" (UID: \"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.251765 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.264393 4763 util.go:30] "No sandbox for pod can be found. 
Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.264393 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.712361 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.801479 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.849326 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80db5535-a70c-4d66-9946-18192483b360","Type":"ContainerStarted","Data":"eb6081a6ae3e26484c3485ed94c1150bf415db62506a87a4f8d0e01249f26d01"}
Dec 01 09:36:15 crc kubenswrapper[4763]: I1201 09:36:15.851264 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7","Type":"ContainerStarted","Data":"c5ea89f847ad49ff53d79ad029a5ce3c2c02eb8199c9c84bbb83977cc23fa046"}
Dec 01 09:36:16 crc kubenswrapper[4763]: I1201 09:36:16.869370 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34e5c16d-5a9f-43f0-a2ba-ca4a768891a7","Type":"ContainerStarted","Data":"6f881618c33a7e6dbe35bb980ee21743850353ae64212e34a05ee7d4441cee4c"}
Dec 01 09:36:16 crc kubenswrapper[4763]: I1201 09:36:16.872359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80db5535-a70c-4d66-9946-18192483b360","Type":"ContainerStarted","Data":"6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab"}
Dec 01 09:36:16 crc kubenswrapper[4763]: I1201 09:36:16.873377 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80db5535-a70c-4d66-9946-18192483b360","Type":"ContainerStarted","Data":"1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f"}
Dec 01 09:36:16 crc kubenswrapper[4763]: I1201 09:36:16.898046 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.898026422 podStartE2EDuration="2.898026422s" podCreationTimestamp="2025-12-01 09:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:36:16.890550361 +0000 UTC m=+1294.159199129" watchObservedRunningTime="2025-12-01 09:36:16.898026422 +0000 UTC m=+1294.166675190"
Dec 01 09:36:16 crc kubenswrapper[4763]: I1201 09:36:16.915356 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.915310179 podStartE2EDuration="2.915310179s" podCreationTimestamp="2025-12-01 09:36:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:36:16.908065444 +0000 UTC m=+1294.176714222" watchObservedRunningTime="2025-12-01 09:36:16.915310179 +0000 UTC m=+1294.183958947"
Dec 01 09:36:16 crc kubenswrapper[4763]: I1201 09:36:16.989906 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 01 09:36:16 crc kubenswrapper[4763]: I1201 09:36:16.990479 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 01 09:36:16 crc kubenswrapper[4763]: I1201 09:36:16.990714 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 01 09:36:17 crc kubenswrapper[4763]:
I1201 09:36:17.020059 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 09:36:17 crc kubenswrapper[4763]: I1201 09:36:17.882212 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 09:36:17 crc kubenswrapper[4763]: I1201 09:36:17.886337 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.095522 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-752rw"] Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.097081 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.115169 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-752rw"] Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.266816 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6d5q\" (UniqueName: \"kubernetes.io/projected/2645413f-063f-475a-860d-1876987608fc-kube-api-access-z6d5q\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.266874 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-config\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.267007 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.267078 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.267107 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.369313 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6d5q\" (UniqueName: \"kubernetes.io/projected/2645413f-063f-475a-860d-1876987608fc-kube-api-access-z6d5q\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.369388 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-config\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.369480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.369543 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.369578 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.370293 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-config\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.370636 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.370665 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.370959 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.388406 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6d5q\" (UniqueName: \"kubernetes.io/projected/2645413f-063f-475a-860d-1876987608fc-kube-api-access-z6d5q\") pod \"dnsmasq-dns-68d4b6d797-752rw\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.428255 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-752rw"
Dec 01 09:36:18 crc kubenswrapper[4763]: I1201 09:36:18.943023 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-752rw"]
Dec 01 09:36:19 crc kubenswrapper[4763]: I1201 09:36:19.899912 4763 generic.go:334] "Generic (PLEG): container finished" podID="2645413f-063f-475a-860d-1876987608fc" containerID="56c775aeb8e9ca3117c6d30148a8320316881e3d9174537a3d54c05d55b758b2" exitCode=0
Dec 01 09:36:19 crc kubenswrapper[4763]: I1201 09:36:19.899971 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" event={"ID":"2645413f-063f-475a-860d-1876987608fc","Type":"ContainerDied","Data":"56c775aeb8e9ca3117c6d30148a8320316881e3d9174537a3d54c05d55b758b2"}
Dec 01 09:36:19 crc kubenswrapper[4763]: I1201 09:36:19.900680 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" event={"ID":"2645413f-063f-475a-860d-1876987608fc","Type":"ContainerStarted","Data":"80abce437aed7b3418d940f2bc405af5804e6de83390fd275b37984066b06738"}
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.252994 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.253073 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.265258 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.429206 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.763541 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.764042 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="ceilometer-central-agent" containerID="cri-o://e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938" gracePeriod=30
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.764101 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="sg-core" containerID="cri-o://856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72" gracePeriod=30
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.764153 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="ceilometer-notification-agent" containerID="cri-o://f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905" gracePeriod=30
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.764181 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="proxy-httpd" containerID="cri-o://1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf" gracePeriod=30
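Annotation: the four "Killing container with a grace period" entries above are the kubelet reacting to the API-side DELETE of ceilometer-0; gracePeriod=30 means each container receives SIGTERM and has up to 30 seconds before a SIGKILL would follow. The "container finished" events in this window then record how each process actually exited: exitCode=0 for a clean shutdown, exitCode=2 for sg-core's own error path, and exitCode=143 (128 + 15) further down for nova-api-log, i.e. terminated by SIGTERM. A small sketch of that decoding, assuming the usual 128+N signal convention of POSIX shells and container runtimes (the convention is background knowledge, not something these logs state):

import signal

def describe_exit_code(code: int) -> str:
    """Decode a container exit code using the common 128+N signal convention."""
    if code == 0:
        return "clean exit"
    if code > 128:
        try:
            return f"killed by {signal.Signals(code - 128).name}"
        except ValueError:
            return f"exit {code} (no matching signal)"
    return f"application error (exit {code})"

# The exit codes reported by the PLEG "container finished" events in this window:
for code in (0, 2, 143):
    print(code, "->", describe_exit_code(code))
# 143 -> "killed by SIGTERM", consistent with the 30-second grace-period kill.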
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.920260 4763 generic.go:334] "Generic (PLEG): container finished" podID="08ea8b4b-223e-495d-a145-27346ad0862f" containerID="856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72" exitCode=2
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.920300 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerDied","Data":"856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72"}
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.924336 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" event={"ID":"2645413f-063f-475a-860d-1876987608fc","Type":"ContainerStarted","Data":"e05f333bcc113aef0126f257b2ca7bd48d1d5c57a7d914764929d1032c3b8f1a"}
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.924571 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-log" containerID="cri-o://a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853" gracePeriod=30
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.925852 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68d4b6d797-752rw"
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.926021 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-api" containerID="cri-o://b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81" gracePeriod=30
Dec 01 09:36:20 crc kubenswrapper[4763]: I1201 09:36:20.964211 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" podStartSLOduration=2.964194864 podStartE2EDuration="2.964194864s" podCreationTimestamp="2025-12-01 09:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:36:20.954716717 +0000 UTC m=+1298.223365485" watchObservedRunningTime="2025-12-01 09:36:20.964194864 +0000 UTC m=+1298.232843632"
Dec 01 09:36:21 crc kubenswrapper[4763]: I1201 09:36:21.933359 4763 generic.go:334] "Generic (PLEG): container finished" podID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerID="a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853" exitCode=143
Dec 01 09:36:21 crc kubenswrapper[4763]: I1201 09:36:21.933430 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3687bcb9-8c1d-4f52-bc92-237681b9c5ed","Type":"ContainerDied","Data":"a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853"}
Dec 01 09:36:21 crc kubenswrapper[4763]: I1201 09:36:21.936914 4763 generic.go:334] "Generic (PLEG): container finished" podID="08ea8b4b-223e-495d-a145-27346ad0862f" containerID="1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf" exitCode=0
Dec 01 09:36:21 crc kubenswrapper[4763]: I1201 09:36:21.936949 4763 generic.go:334] "Generic (PLEG): container finished" podID="08ea8b4b-223e-495d-a145-27346ad0862f" containerID="e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938" exitCode=0
Dec 01 09:36:21 crc kubenswrapper[4763]: I1201 09:36:21.937163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerDied","Data":"1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf"}
Dec 01 09:36:21 crc kubenswrapper[4763]: I1201 09:36:21.937228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerDied","Data":"e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938"} Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.464929 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.594574 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-config-data\") pod \"08ea8b4b-223e-495d-a145-27346ad0862f\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.595621 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-run-httpd\") pod \"08ea8b4b-223e-495d-a145-27346ad0862f\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.595799 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnpnv\" (UniqueName: \"kubernetes.io/projected/08ea8b4b-223e-495d-a145-27346ad0862f-kube-api-access-rnpnv\") pod \"08ea8b4b-223e-495d-a145-27346ad0862f\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.595925 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-combined-ca-bundle\") pod \"08ea8b4b-223e-495d-a145-27346ad0862f\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.596046 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-ceilometer-tls-certs\") pod \"08ea8b4b-223e-495d-a145-27346ad0862f\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.596125 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-scripts\") pod \"08ea8b4b-223e-495d-a145-27346ad0862f\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.595962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "08ea8b4b-223e-495d-a145-27346ad0862f" (UID: "08ea8b4b-223e-495d-a145-27346ad0862f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.596634 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-log-httpd\") pod \"08ea8b4b-223e-495d-a145-27346ad0862f\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.597086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-sg-core-conf-yaml\") pod \"08ea8b4b-223e-495d-a145-27346ad0862f\" (UID: \"08ea8b4b-223e-495d-a145-27346ad0862f\") " Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.596989 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "08ea8b4b-223e-495d-a145-27346ad0862f" (UID: "08ea8b4b-223e-495d-a145-27346ad0862f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.597749 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.597821 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08ea8b4b-223e-495d-a145-27346ad0862f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.600344 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-scripts" (OuterVolumeSpecName: "scripts") pod "08ea8b4b-223e-495d-a145-27346ad0862f" (UID: "08ea8b4b-223e-495d-a145-27346ad0862f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.600419 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ea8b4b-223e-495d-a145-27346ad0862f-kube-api-access-rnpnv" (OuterVolumeSpecName: "kube-api-access-rnpnv") pod "08ea8b4b-223e-495d-a145-27346ad0862f" (UID: "08ea8b4b-223e-495d-a145-27346ad0862f"). InnerVolumeSpecName "kube-api-access-rnpnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.642358 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "08ea8b4b-223e-495d-a145-27346ad0862f" (UID: "08ea8b4b-223e-495d-a145-27346ad0862f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.658355 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "08ea8b4b-223e-495d-a145-27346ad0862f" (UID: "08ea8b4b-223e-495d-a145-27346ad0862f"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.687665 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08ea8b4b-223e-495d-a145-27346ad0862f" (UID: "08ea8b4b-223e-495d-a145-27346ad0862f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.698424 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnpnv\" (UniqueName: \"kubernetes.io/projected/08ea8b4b-223e-495d-a145-27346ad0862f-kube-api-access-rnpnv\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.698627 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.698731 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.698816 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.698878 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.735361 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-config-data" (OuterVolumeSpecName: "config-data") pod "08ea8b4b-223e-495d-a145-27346ad0862f" (UID: "08ea8b4b-223e-495d-a145-27346ad0862f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.799415 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ea8b4b-223e-495d-a145-27346ad0862f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.953155 4763 generic.go:334] "Generic (PLEG): container finished" podID="08ea8b4b-223e-495d-a145-27346ad0862f" containerID="f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905" exitCode=0 Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.953214 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.953228 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerDied","Data":"f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905"} Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.954419 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08ea8b4b-223e-495d-a145-27346ad0862f","Type":"ContainerDied","Data":"74534a738636bcd6cd39b4b5e3a08956275cb168646bfc55dcde84ca4841a52e"} Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.954476 4763 scope.go:117] "RemoveContainer" containerID="1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf" Dec 01 09:36:22 crc kubenswrapper[4763]: I1201 09:36:22.984904 4763 scope.go:117] "RemoveContainer" containerID="856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.016901 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.026538 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.026966 4763 scope.go:117] "RemoveContainer" containerID="f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.037678 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:23 crc kubenswrapper[4763]: E1201 09:36:23.038256 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="proxy-httpd" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.038378 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="proxy-httpd" Dec 01 09:36:23 crc kubenswrapper[4763]: E1201 09:36:23.038493 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="ceilometer-notification-agent" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.038566 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="ceilometer-notification-agent" Dec 01 09:36:23 crc kubenswrapper[4763]: E1201 09:36:23.038647 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="ceilometer-central-agent" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.038725 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="ceilometer-central-agent" Dec 01 09:36:23 crc kubenswrapper[4763]: E1201 09:36:23.038803 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="sg-core" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.038872 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="sg-core" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.039091 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="proxy-httpd" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.039157 4763 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="ceilometer-notification-agent" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.039217 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="ceilometer-central-agent" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.039278 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" containerName="sg-core" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.040825 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.050120 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.069652 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.070091 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.070395 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.084202 4763 scope.go:117] "RemoveContainer" containerID="e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.142904 4763 scope.go:117] "RemoveContainer" containerID="1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf" Dec 01 09:36:23 crc kubenswrapper[4763]: E1201 09:36:23.143381 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf\": container with ID starting with 1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf not found: ID does not exist" containerID="1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.143422 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf"} err="failed to get container status \"1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf\": rpc error: code = NotFound desc = could not find container \"1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf\": container with ID starting with 1ffb560e0755365560c66055a07a93e51dd8a9438f616663cd078a732de5fabf not found: ID does not exist" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.143474 4763 scope.go:117] "RemoveContainer" containerID="856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72" Dec 01 09:36:23 crc kubenswrapper[4763]: E1201 09:36:23.143998 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72\": container with ID starting with 856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72 not found: ID does not exist" containerID="856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.144046 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72"} err="failed to get container status \"856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72\": rpc error: code = NotFound desc = could not find container \"856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72\": container with ID starting with 856fbc14984b0dc1f63fb67a00d4cd0f98ecedefda3994e146b5ea20af476d72 not found: ID does not exist" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.144076 4763 scope.go:117] "RemoveContainer" containerID="f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905" Dec 01 09:36:23 crc kubenswrapper[4763]: E1201 09:36:23.144527 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905\": container with ID starting with f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905 not found: ID does not exist" containerID="f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.144573 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905"} err="failed to get container status \"f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905\": rpc error: code = NotFound desc = could not find container \"f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905\": container with ID starting with f17fa9ae2f397427c774420536f5a3d508ed626e1bda6f85b8b4ad542c9e7905 not found: ID does not exist" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.144602 4763 scope.go:117] "RemoveContainer" containerID="e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938" Dec 01 09:36:23 crc kubenswrapper[4763]: E1201 09:36:23.144954 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938\": container with ID starting with e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938 not found: ID does not exist" containerID="e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.144979 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938"} err="failed to get container status \"e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938\": rpc error: code = NotFound desc = could not find container \"e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938\": container with ID starting with e9249d5a9a0a4f8160f2e772df3fcd8993d402c32d18797802c5ddcb145c5938 not found: ID does not exist" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.211618 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.212025 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-config-data\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.212067 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-scripts\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.212102 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t46cz\" (UniqueName: \"kubernetes.io/projected/75e5bac3-096d-42dd-b1f8-19c03774fb1c-kube-api-access-t46cz\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.212151 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.212183 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.212207 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-log-httpd\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.212257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-run-httpd\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.313891 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.313952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.313983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-log-httpd\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 
09:36:23.314038 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-run-httpd\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.314087 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.314104 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-config-data\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.314133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-scripts\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.314158 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t46cz\" (UniqueName: \"kubernetes.io/projected/75e5bac3-096d-42dd-b1f8-19c03774fb1c-kube-api-access-t46cz\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.314815 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-log-httpd\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.314833 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-run-httpd\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.318276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.318281 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-scripts\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.318747 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.319229 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-config-data\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.332108 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.333867 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t46cz\" (UniqueName: \"kubernetes.io/projected/75e5bac3-096d-42dd-b1f8-19c03774fb1c-kube-api-access-t46cz\") pod \"ceilometer-0\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.436118 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.895374 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 09:36:23 crc kubenswrapper[4763]: W1201 09:36:23.895533 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75e5bac3_096d_42dd_b1f8_19c03774fb1c.slice/crio-31c14605f1ffe1886f54bf07784492cdcd4de94049cc402a9d1401b41b276ae0 WatchSource:0}: Error finding container 31c14605f1ffe1886f54bf07784492cdcd4de94049cc402a9d1401b41b276ae0: Status 404 returned error can't find the container with id 31c14605f1ffe1886f54bf07784492cdcd4de94049cc402a9d1401b41b276ae0 Dec 01 09:36:23 crc kubenswrapper[4763]: I1201 09:36:23.964096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerStarted","Data":"31c14605f1ffe1886f54bf07784492cdcd4de94049cc402a9d1401b41b276ae0"} Dec 01 09:36:24 crc kubenswrapper[4763]: E1201 09:36:24.334356 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3687bcb9_8c1d_4f52_bc92_237681b9c5ed.slice/crio-conmon-b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81.scope\": RecentStats: unable to find data in memory cache]" Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.492842 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.566052 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-logs\") pod \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") "
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.566192 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d85rm\" (UniqueName: \"kubernetes.io/projected/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-kube-api-access-d85rm\") pod \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") "
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.566319 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-config-data\") pod \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") "
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.566361 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-combined-ca-bundle\") pod \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\" (UID: \"3687bcb9-8c1d-4f52-bc92-237681b9c5ed\") "
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.566757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-logs" (OuterVolumeSpecName: "logs") pod "3687bcb9-8c1d-4f52-bc92-237681b9c5ed" (UID: "3687bcb9-8c1d-4f52-bc92-237681b9c5ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.566921 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-logs\") on node \"crc\" DevicePath \"\""
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.570929 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-kube-api-access-d85rm" (OuterVolumeSpecName: "kube-api-access-d85rm") pod "3687bcb9-8c1d-4f52-bc92-237681b9c5ed" (UID: "3687bcb9-8c1d-4f52-bc92-237681b9c5ed"). InnerVolumeSpecName "kube-api-access-d85rm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.617326 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3687bcb9-8c1d-4f52-bc92-237681b9c5ed" (UID: "3687bcb9-8c1d-4f52-bc92-237681b9c5ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
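Annotation: teardown of the old nova-api-0 pod (UID 3687bcb9-...) mirrors the mount flow in reverse: "operationExecutor.UnmountVolume started", then "UnmountVolume.TearDown succeeded" per volume, and finally the "Volume detached ... DevicePath \"\"" records that follow below once the reconciler considers each volume gone from the node. A sketch that pairs the start and detach records, under the same one-entry-per-line assumption as above (the helper is illustrative, not part of the kubelet):

import re

# "UnmountVolume started" marks intent; "Volume detached ... DevicePath" marks
# the reconciler's final state for that volume on the node.
STARTED = re.compile(r'operationExecutor\.UnmountVolume started for volume \\"([^"\\]+)\\"')
DETACHED = re.compile(r'"Volume detached for volume \\"([^"\\]+)\\"')

def unmount_status(lines):
    """Split volumes into those fully detached and those still mid-teardown."""
    started, detached = set(), set()
    for line in lines:
        started.update(STARTED.findall(line))
        detached.update(DETACHED.findall(line))
    return {"detached": started & detached, "pending": started - detached}

For this pod all four volumes (logs, kube-api-access-d85rm, config-data, combined-ca-bundle) end up in the detached set by 09:36:24.668685.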
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.621965 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-config-data" (OuterVolumeSpecName: "config-data") pod "3687bcb9-8c1d-4f52-bc92-237681b9c5ed" (UID: "3687bcb9-8c1d-4f52-bc92-237681b9c5ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.668634 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d85rm\" (UniqueName: \"kubernetes.io/projected/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-kube-api-access-d85rm\") on node \"crc\" DevicePath \"\""
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.668674 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.668685 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3687bcb9-8c1d-4f52-bc92-237681b9c5ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.975296 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerStarted","Data":"1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf"}
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.978284 4763 generic.go:334] "Generic (PLEG): container finished" podID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerID="b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81" exitCode=0
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.978328 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3687bcb9-8c1d-4f52-bc92-237681b9c5ed","Type":"ContainerDied","Data":"b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81"}
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.978358 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3687bcb9-8c1d-4f52-bc92-237681b9c5ed","Type":"ContainerDied","Data":"6800ce83cc77aabe59c418bd6709bdd3c4c0a70d7573c2d8cc288071115a726a"}
Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.978332 4763 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.978376 4763 scope.go:117] "RemoveContainer" containerID="b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81" Dec 01 09:36:24 crc kubenswrapper[4763]: I1201 09:36:24.996912 4763 scope.go:117] "RemoveContainer" containerID="a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.010318 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ea8b4b-223e-495d-a145-27346ad0862f" path="/var/lib/kubelet/pods/08ea8b4b-223e-495d-a145-27346ad0862f/volumes" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.039120 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.071588 4763 scope.go:117] "RemoveContainer" containerID="b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.071799 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:25 crc kubenswrapper[4763]: E1201 09:36:25.073065 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81\": container with ID starting with b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81 not found: ID does not exist" containerID="b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.073179 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81"} err="failed to get container status \"b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81\": rpc error: code = NotFound desc = could not find container \"b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81\": container with ID starting with b851079e1f1dc0e511833c67d01ec2b8f2b60ed8de6746851f8322ee33d9dd81 not found: ID does not exist" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.073276 4763 scope.go:117] "RemoveContainer" containerID="a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853" Dec 01 09:36:25 crc kubenswrapper[4763]: E1201 09:36:25.074314 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853\": container with ID starting with a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853 not found: ID does not exist" containerID="a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.074355 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853"} err="failed to get container status \"a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853\": rpc error: code = NotFound desc = could not find container \"a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853\": container with ID starting with a108c9688da9f3e4a4ad96ccb804bd330d4b83c6b02f7088219bbacd4f43f853 not found: ID does not exist" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.082075 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-0"]
Dec 01 09:36:25 crc kubenswrapper[4763]: E1201 09:36:25.083575 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-api"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.083724 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-api"
Dec 01 09:36:25 crc kubenswrapper[4763]: E1201 09:36:25.083807 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-log"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.083877 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-log"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.084209 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-api"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.084343 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" containerName="nova-api-log"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.090278 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.094000 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.094267 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.094583 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.099486 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.178307 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.178381 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.178418 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-public-tls-certs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0"
Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.178578 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtt4c\" (UniqueName: \"kubernetes.io/projected/664c10fa-d5c1-4e90-80f3-9933f73a4db3-kube-api-access-wtt4c\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0"
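Annotation: note the UID turnover here. The nova-api-0 that was just torn down ran as UID 3687bcb9-8c1d-4f52-bc92-237681b9c5ed, while the replacement admitted by this SyncLoop ADD is 664c10fa-d5c1-4e90-80f3-9933f73a4db3; the cpu_manager, state_mem and memory_manager "RemoveStaleState" entries above are the kubelet discarding resource-manager bookkeeping keyed by the old UID before it processes the new pod's volumes. A sketch for spotting that kind of name-to-UID turnover in a capture like this one (the helper is illustrative, not part of the kubelet):

import re
from collections import defaultdict

# Matches entries of the form: pod \"nova-api-0\" (UID: \"664c10fa-...\").
# The backslash-escaped quotes are literal characters in these journal lines.
UID_RE = re.compile(r'pod \\"(?P<name>[^"\\]+)\\" \(UID: \\"(?P<uid>[^"\\]+)\\"\)')

def uid_turnover(lines):
    """Report pod names observed with more than one UID, in first-seen order."""
    seen = defaultdict(list)
    for line in lines:
        for m in UID_RE.finditer(line):
            if m["uid"] not in seen[m["name"]]:
                seen[m["name"]].append(m["uid"])
    return {name: uids for name, uids in seen.items() if len(uids) > 1}

Over a capture that also includes the original pod's mount entries, this would report nova-api-0 mapping to both UIDs; the same delete-and-recreate cycle happened to ceilometer-0 at 09:36:23 (08ea8b4b-... replaced by 75e5bac3-...).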
kubenswrapper[4763]: I1201 09:36:25.178637 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/664c10fa-d5c1-4e90-80f3-9933f73a4db3-logs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.178696 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-config-data\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.254628 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.255157 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.266149 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.280983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/664c10fa-d5c1-4e90-80f3-9933f73a4db3-logs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.281695 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-config-data\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.281962 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.282072 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.282216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-public-tls-certs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.282442 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtt4c\" (UniqueName: \"kubernetes.io/projected/664c10fa-d5c1-4e90-80f3-9933f73a4db3-kube-api-access-wtt4c\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.281696 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/664c10fa-d5c1-4e90-80f3-9933f73a4db3-logs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.293800 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-public-tls-certs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.300240 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-config-data\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.300542 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.302857 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.309180 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.311360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtt4c\" (UniqueName: \"kubernetes.io/projected/664c10fa-d5c1-4e90-80f3-9933f73a4db3-kube-api-access-wtt4c\") pod \"nova-api-0\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.470038 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.945853 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:25 crc kubenswrapper[4763]: W1201 09:36:25.947979 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod664c10fa_d5c1_4e90_80f3_9933f73a4db3.slice/crio-68d5ccd3f5adad600583b674019e625f7e25bc7b3c97db5ad50da36d615564e2 WatchSource:0}: Error finding container 68d5ccd3f5adad600583b674019e625f7e25bc7b3c97db5ad50da36d615564e2: Status 404 returned error can't find the container with id 68d5ccd3f5adad600583b674019e625f7e25bc7b3c97db5ad50da36d615564e2 Dec 01 09:36:25 crc kubenswrapper[4763]: I1201 09:36:25.994937 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"664c10fa-d5c1-4e90-80f3-9933f73a4db3","Type":"ContainerStarted","Data":"68d5ccd3f5adad600583b674019e625f7e25bc7b3c97db5ad50da36d615564e2"} Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.007083 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerStarted","Data":"80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2"} Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.026664 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.233962 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6nsvb"] Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.235286 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.255918 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.256204 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.265182 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6nsvb"] Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.269633 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.180:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.269896 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.180:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.311107 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-config-data\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.311148 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.311219 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-scripts\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.311250 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh68p\" (UniqueName: \"kubernetes.io/projected/af83b033-4df7-4f11-bf61-ab1addfeb933-kube-api-access-vh68p\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.412721 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-config-data\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.412763 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.412833 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-scripts\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.412862 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh68p\" (UniqueName: \"kubernetes.io/projected/af83b033-4df7-4f11-bf61-ab1addfeb933-kube-api-access-vh68p\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.417799 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-config-data\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.418592 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-scripts\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.419562 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.434924 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh68p\" (UniqueName: \"kubernetes.io/projected/af83b033-4df7-4f11-bf61-ab1addfeb933-kube-api-access-vh68p\") pod \"nova-cell1-cell-mapping-6nsvb\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:26 crc kubenswrapper[4763]: I1201 09:36:26.660886 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:27 crc kubenswrapper[4763]: I1201 09:36:27.009090 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3687bcb9-8c1d-4f52-bc92-237681b9c5ed" path="/var/lib/kubelet/pods/3687bcb9-8c1d-4f52-bc92-237681b9c5ed/volumes" Dec 01 09:36:27 crc kubenswrapper[4763]: I1201 09:36:27.018811 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerStarted","Data":"1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e"} Dec 01 09:36:27 crc kubenswrapper[4763]: I1201 09:36:27.022441 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"664c10fa-d5c1-4e90-80f3-9933f73a4db3","Type":"ContainerStarted","Data":"dd9c61740051e58d77cdcfab86e5eaa3b2d704cafb181d8664e8da2be0498b42"} Dec 01 09:36:27 crc kubenswrapper[4763]: I1201 09:36:27.022519 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"664c10fa-d5c1-4e90-80f3-9933f73a4db3","Type":"ContainerStarted","Data":"43db7f4282f3c4f7cff725f26f345432c8c6593fbfb96944b1b4e284cfea8d7f"} Dec 01 09:36:27 crc kubenswrapper[4763]: I1201 09:36:27.045168 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.045147393 podStartE2EDuration="2.045147393s" podCreationTimestamp="2025-12-01 09:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:36:27.044565209 +0000 UTC m=+1304.313213997" watchObservedRunningTime="2025-12-01 09:36:27.045147393 +0000 UTC m=+1304.313796161" Dec 01 09:36:27 crc kubenswrapper[4763]: I1201 09:36:27.129132 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6nsvb"] Dec 01 09:36:28 crc kubenswrapper[4763]: I1201 09:36:28.033552 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6nsvb" event={"ID":"af83b033-4df7-4f11-bf61-ab1addfeb933","Type":"ContainerStarted","Data":"867ac822b4d99d98ebbfbf598f7e8b3ab7c2cef8ddc81860f77880cbba820179"} Dec 01 09:36:28 crc kubenswrapper[4763]: I1201 09:36:28.033826 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6nsvb" event={"ID":"af83b033-4df7-4f11-bf61-ab1addfeb933","Type":"ContainerStarted","Data":"1bebd3731d386b043fed6f9fb2f609afd89e2874d1b64d9a7177c4d1efac2e18"} Dec 01 09:36:28 crc kubenswrapper[4763]: I1201 09:36:28.058594 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6nsvb" podStartSLOduration=2.05857797 podStartE2EDuration="2.05857797s" podCreationTimestamp="2025-12-01 09:36:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:36:28.04988758 +0000 UTC m=+1305.318536348" watchObservedRunningTime="2025-12-01 09:36:28.05857797 +0000 UTC m=+1305.327226728" Dec 01 09:36:28 crc kubenswrapper[4763]: I1201 09:36:28.429634 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:36:28 crc kubenswrapper[4763]: I1201 09:36:28.518022 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gzcx2"] Dec 01 09:36:28 crc kubenswrapper[4763]: I1201 09:36:28.518649 4763 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" podUID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" containerName="dnsmasq-dns" containerID="cri-o://ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8" gracePeriod=10 Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.022568 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.042140 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" containerID="ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8" exitCode=0 Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.042216 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" event={"ID":"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e","Type":"ContainerDied","Data":"ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8"} Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.042250 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" event={"ID":"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e","Type":"ContainerDied","Data":"6be91e1bbb2433608453434cd4e1ed5224bf7073bdca26a3f8fbf25bb3d72372"} Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.042271 4763 scope.go:117] "RemoveContainer" containerID="ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.042365 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-gzcx2" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.051605 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerStarted","Data":"31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0"} Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.052349 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.079184 4763 scope.go:117] "RemoveContainer" containerID="4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.097485 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-config\") pod \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.097570 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-dns-svc\") pod \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.097681 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-nb\") pod \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.097707 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7kgtd\" (UniqueName: \"kubernetes.io/projected/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-kube-api-access-7kgtd\") pod \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.097730 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-sb\") pod \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\" (UID: \"a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e\") " Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.099930 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.938037865 podStartE2EDuration="7.099914875s" podCreationTimestamp="2025-12-01 09:36:22 +0000 UTC" firstStartedPulling="2025-12-01 09:36:23.897779887 +0000 UTC m=+1301.166428655" lastFinishedPulling="2025-12-01 09:36:28.059656897 +0000 UTC m=+1305.328305665" observedRunningTime="2025-12-01 09:36:29.097249957 +0000 UTC m=+1306.365898725" watchObservedRunningTime="2025-12-01 09:36:29.099914875 +0000 UTC m=+1306.368563643" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.127949 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-kube-api-access-7kgtd" (OuterVolumeSpecName: "kube-api-access-7kgtd") pod "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" (UID: "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e"). InnerVolumeSpecName "kube-api-access-7kgtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.134646 4763 scope.go:117] "RemoveContainer" containerID="ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8" Dec 01 09:36:29 crc kubenswrapper[4763]: E1201 09:36:29.141779 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8\": container with ID starting with ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8 not found: ID does not exist" containerID="ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.141824 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8"} err="failed to get container status \"ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8\": rpc error: code = NotFound desc = could not find container \"ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8\": container with ID starting with ef3a26bc8e21014e0879232585f39b05eb0678c7e6fda382085f2cb44a715be8 not found: ID does not exist" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.141849 4763 scope.go:117] "RemoveContainer" containerID="4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df" Dec 01 09:36:29 crc kubenswrapper[4763]: E1201 09:36:29.143810 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df\": container with ID starting with 4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df not found: ID does not exist" containerID="4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df" Dec 01 09:36:29 crc 
kubenswrapper[4763]: I1201 09:36:29.143853 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df"} err="failed to get container status \"4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df\": rpc error: code = NotFound desc = could not find container \"4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df\": container with ID starting with 4849c8f10329a5fa1c23b83bbb3a4d78737911c54a0323e3175a6152c61dd7df not found: ID does not exist" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.199301 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kgtd\" (UniqueName: \"kubernetes.io/projected/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-kube-api-access-7kgtd\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.210582 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-config" (OuterVolumeSpecName: "config") pod "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" (UID: "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.221104 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" (UID: "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.227921 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" (UID: "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.239993 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" (UID: "a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.300504 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.300693 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.300775 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.300833 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.372387 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gzcx2"] Dec 01 09:36:29 crc kubenswrapper[4763]: I1201 09:36:29.380861 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-gzcx2"] Dec 01 09:36:31 crc kubenswrapper[4763]: I1201 09:36:31.014589 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" path="/var/lib/kubelet/pods/a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e/volumes" Dec 01 09:36:33 crc kubenswrapper[4763]: I1201 09:36:33.095508 4763 generic.go:334] "Generic (PLEG): container finished" podID="af83b033-4df7-4f11-bf61-ab1addfeb933" containerID="867ac822b4d99d98ebbfbf598f7e8b3ab7c2cef8ddc81860f77880cbba820179" exitCode=0 Dec 01 09:36:33 crc kubenswrapper[4763]: I1201 09:36:33.095687 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6nsvb" event={"ID":"af83b033-4df7-4f11-bf61-ab1addfeb933","Type":"ContainerDied","Data":"867ac822b4d99d98ebbfbf598f7e8b3ab7c2cef8ddc81860f77880cbba820179"} Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.489035 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.507381 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-config-data\") pod \"af83b033-4df7-4f11-bf61-ab1addfeb933\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.507432 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-scripts\") pod \"af83b033-4df7-4f11-bf61-ab1addfeb933\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.507593 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh68p\" (UniqueName: \"kubernetes.io/projected/af83b033-4df7-4f11-bf61-ab1addfeb933-kube-api-access-vh68p\") pod \"af83b033-4df7-4f11-bf61-ab1addfeb933\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.507721 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-combined-ca-bundle\") pod \"af83b033-4df7-4f11-bf61-ab1addfeb933\" (UID: \"af83b033-4df7-4f11-bf61-ab1addfeb933\") " Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.519783 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af83b033-4df7-4f11-bf61-ab1addfeb933-kube-api-access-vh68p" (OuterVolumeSpecName: "kube-api-access-vh68p") pod "af83b033-4df7-4f11-bf61-ab1addfeb933" (UID: "af83b033-4df7-4f11-bf61-ab1addfeb933"). InnerVolumeSpecName "kube-api-access-vh68p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.524276 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-scripts" (OuterVolumeSpecName: "scripts") pod "af83b033-4df7-4f11-bf61-ab1addfeb933" (UID: "af83b033-4df7-4f11-bf61-ab1addfeb933"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.544079 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af83b033-4df7-4f11-bf61-ab1addfeb933" (UID: "af83b033-4df7-4f11-bf61-ab1addfeb933"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.571412 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-config-data" (OuterVolumeSpecName: "config-data") pod "af83b033-4df7-4f11-bf61-ab1addfeb933" (UID: "af83b033-4df7-4f11-bf61-ab1addfeb933"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.609062 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.609097 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.609109 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh68p\" (UniqueName: \"kubernetes.io/projected/af83b033-4df7-4f11-bf61-ab1addfeb933-kube-api-access-vh68p\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:34 crc kubenswrapper[4763]: I1201 09:36:34.609122 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af83b033-4df7-4f11-bf61-ab1addfeb933-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.111283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6nsvb" event={"ID":"af83b033-4df7-4f11-bf61-ab1addfeb933","Type":"ContainerDied","Data":"1bebd3731d386b043fed6f9fb2f609afd89e2874d1b64d9a7177c4d1efac2e18"} Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.111517 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bebd3731d386b043fed6f9fb2f609afd89e2874d1b64d9a7177c4d1efac2e18" Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.111383 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6nsvb" Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.260544 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.263444 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.267663 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.297576 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.298037 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerName="nova-api-log" containerID="cri-o://43db7f4282f3c4f7cff725f26f345432c8c6593fbfb96944b1b4e284cfea8d7f" gracePeriod=30 Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.298102 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerName="nova-api-api" containerID="cri-o://dd9c61740051e58d77cdcfab86e5eaa3b2d704cafb181d8664e8da2be0498b42" gracePeriod=30 Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.327596 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.327874 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fb4ca475-898b-4bb6-a249-8bc297417775" 
containerName="nova-scheduler-scheduler" containerID="cri-o://8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" gracePeriod=30 Dec 01 09:36:35 crc kubenswrapper[4763]: I1201 09:36:35.397182 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:35 crc kubenswrapper[4763]: E1201 09:36:35.960510 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:36:35 crc kubenswrapper[4763]: E1201 09:36:35.965621 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:36:35 crc kubenswrapper[4763]: E1201 09:36:35.967050 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:36:35 crc kubenswrapper[4763]: E1201 09:36:35.967079 4763 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fb4ca475-898b-4bb6-a249-8bc297417775" containerName="nova-scheduler-scheduler" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.146282 4763 generic.go:334] "Generic (PLEG): container finished" podID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerID="dd9c61740051e58d77cdcfab86e5eaa3b2d704cafb181d8664e8da2be0498b42" exitCode=0 Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.146320 4763 generic.go:334] "Generic (PLEG): container finished" podID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerID="43db7f4282f3c4f7cff725f26f345432c8c6593fbfb96944b1b4e284cfea8d7f" exitCode=143 Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.146546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"664c10fa-d5c1-4e90-80f3-9933f73a4db3","Type":"ContainerDied","Data":"dd9c61740051e58d77cdcfab86e5eaa3b2d704cafb181d8664e8da2be0498b42"} Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.146578 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"664c10fa-d5c1-4e90-80f3-9933f73a4db3","Type":"ContainerDied","Data":"43db7f4282f3c4f7cff725f26f345432c8c6593fbfb96944b1b4e284cfea8d7f"} Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.146590 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"664c10fa-d5c1-4e90-80f3-9933f73a4db3","Type":"ContainerDied","Data":"68d5ccd3f5adad600583b674019e625f7e25bc7b3c97db5ad50da36d615564e2"} Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.146603 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d5ccd3f5adad600583b674019e625f7e25bc7b3c97db5ad50da36d615564e2" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.154724 4763 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.211678 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.259839 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtt4c\" (UniqueName: \"kubernetes.io/projected/664c10fa-d5c1-4e90-80f3-9933f73a4db3-kube-api-access-wtt4c\") pod \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.259934 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-public-tls-certs\") pod \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.260032 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-config-data\") pod \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.260110 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/664c10fa-d5c1-4e90-80f3-9933f73a4db3-logs\") pod \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.260140 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-combined-ca-bundle\") pod \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.260175 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-internal-tls-certs\") pod \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\" (UID: \"664c10fa-d5c1-4e90-80f3-9933f73a4db3\") " Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.262123 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/664c10fa-d5c1-4e90-80f3-9933f73a4db3-logs" (OuterVolumeSpecName: "logs") pod "664c10fa-d5c1-4e90-80f3-9933f73a4db3" (UID: "664c10fa-d5c1-4e90-80f3-9933f73a4db3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.279984 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664c10fa-d5c1-4e90-80f3-9933f73a4db3-kube-api-access-wtt4c" (OuterVolumeSpecName: "kube-api-access-wtt4c") pod "664c10fa-d5c1-4e90-80f3-9933f73a4db3" (UID: "664c10fa-d5c1-4e90-80f3-9933f73a4db3"). InnerVolumeSpecName "kube-api-access-wtt4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.347811 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-config-data" (OuterVolumeSpecName: "config-data") pod "664c10fa-d5c1-4e90-80f3-9933f73a4db3" (UID: "664c10fa-d5c1-4e90-80f3-9933f73a4db3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.348503 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "664c10fa-d5c1-4e90-80f3-9933f73a4db3" (UID: "664c10fa-d5c1-4e90-80f3-9933f73a4db3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.354646 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "664c10fa-d5c1-4e90-80f3-9933f73a4db3" (UID: "664c10fa-d5c1-4e90-80f3-9933f73a4db3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.364409 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtt4c\" (UniqueName: \"kubernetes.io/projected/664c10fa-d5c1-4e90-80f3-9933f73a4db3-kube-api-access-wtt4c\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.364446 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.364469 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.364478 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/664c10fa-d5c1-4e90-80f3-9933f73a4db3-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.364490 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.384651 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "664c10fa-d5c1-4e90-80f3-9933f73a4db3" (UID: "664c10fa-d5c1-4e90-80f3-9933f73a4db3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:36 crc kubenswrapper[4763]: I1201 09:36:36.465534 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/664c10fa-d5c1-4e90-80f3-9933f73a4db3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.155334 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.155490 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-log" containerID="cri-o://1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f" gracePeriod=30 Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.155533 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-metadata" containerID="cri-o://6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab" gracePeriod=30 Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.184361 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.192729 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.211982 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:37 crc kubenswrapper[4763]: E1201 09:36:37.212336 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" containerName="dnsmasq-dns" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.212351 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" containerName="dnsmasq-dns" Dec 01 09:36:37 crc kubenswrapper[4763]: E1201 09:36:37.212362 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerName="nova-api-log" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.212367 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerName="nova-api-log" Dec 01 09:36:37 crc kubenswrapper[4763]: E1201 09:36:37.212379 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerName="nova-api-api" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.212385 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerName="nova-api-api" Dec 01 09:36:37 crc kubenswrapper[4763]: E1201 09:36:37.212398 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" containerName="init" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.212404 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" containerName="init" Dec 01 09:36:37 crc kubenswrapper[4763]: E1201 09:36:37.212418 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af83b033-4df7-4f11-bf61-ab1addfeb933" containerName="nova-manage" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.212423 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="af83b033-4df7-4f11-bf61-ab1addfeb933" containerName="nova-manage" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.212599 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d62f8f-38b1-4b3b-9d44-1dbde461ca9e" containerName="dnsmasq-dns" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.212612 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerName="nova-api-log" Dec 01 09:36:37 crc 
kubenswrapper[4763]: I1201 09:36:37.212621 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" containerName="nova-api-api" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.212641 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="af83b033-4df7-4f11-bf61-ab1addfeb933" containerName="nova-manage" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.213486 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.215796 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.216097 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.216301 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.236502 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.277546 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98f6n\" (UniqueName: \"kubernetes.io/projected/5c652d84-294e-4f79-bbcd-37fca6657cd6-kube-api-access-98f6n\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.277727 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.277771 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.277866 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.277903 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c652d84-294e-4f79-bbcd-37fca6657cd6-logs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.277929 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-config-data\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.379499 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.380185 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c652d84-294e-4f79-bbcd-37fca6657cd6-logs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.380616 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c652d84-294e-4f79-bbcd-37fca6657cd6-logs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.380675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-config-data\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.380701 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98f6n\" (UniqueName: \"kubernetes.io/projected/5c652d84-294e-4f79-bbcd-37fca6657cd6-kube-api-access-98f6n\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.380785 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.380829 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.383673 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-config-data\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.384500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.384960 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-public-tls-certs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.385009 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5c652d84-294e-4f79-bbcd-37fca6657cd6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.400118 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98f6n\" (UniqueName: \"kubernetes.io/projected/5c652d84-294e-4f79-bbcd-37fca6657cd6-kube-api-access-98f6n\") pod \"nova-api-0\" (UID: \"5c652d84-294e-4f79-bbcd-37fca6657cd6\") " pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.531121 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 01 09:36:37 crc kubenswrapper[4763]: W1201 09:36:37.982947 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c652d84_294e_4f79_bbcd_37fca6657cd6.slice/crio-2da284895800e9ef0b3c3b34dad202ee95aa6213ec9730fe66b7eaecb90ab1ce WatchSource:0}: Error finding container 2da284895800e9ef0b3c3b34dad202ee95aa6213ec9730fe66b7eaecb90ab1ce: Status 404 returned error can't find the container with id 2da284895800e9ef0b3c3b34dad202ee95aa6213ec9730fe66b7eaecb90ab1ce Dec 01 09:36:37 crc kubenswrapper[4763]: I1201 09:36:37.984351 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 01 09:36:38 crc kubenswrapper[4763]: I1201 09:36:38.166500 4763 generic.go:334] "Generic (PLEG): container finished" podID="80db5535-a70c-4d66-9946-18192483b360" containerID="1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f" exitCode=143 Dec 01 09:36:38 crc kubenswrapper[4763]: I1201 09:36:38.167373 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80db5535-a70c-4d66-9946-18192483b360","Type":"ContainerDied","Data":"1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f"} Dec 01 09:36:38 crc kubenswrapper[4763]: I1201 09:36:38.168721 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c652d84-294e-4f79-bbcd-37fca6657cd6","Type":"ContainerStarted","Data":"2da284895800e9ef0b3c3b34dad202ee95aa6213ec9730fe66b7eaecb90ab1ce"} Dec 01 09:36:39 crc kubenswrapper[4763]: I1201 09:36:39.015128 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664c10fa-d5c1-4e90-80f3-9933f73a4db3" path="/var/lib/kubelet/pods/664c10fa-d5c1-4e90-80f3-9933f73a4db3/volumes" Dec 01 09:36:39 crc kubenswrapper[4763]: I1201 09:36:39.177204 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c652d84-294e-4f79-bbcd-37fca6657cd6","Type":"ContainerStarted","Data":"0940d7f64d68d4cb678e0ddb1eefd60a9135b119a8a6fe821f1a45bee002a5ac"} Dec 01 09:36:39 crc kubenswrapper[4763]: I1201 09:36:39.177253 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5c652d84-294e-4f79-bbcd-37fca6657cd6","Type":"ContainerStarted","Data":"1aa026cce1e1274e24a4aa69dc673b1bd1f43568c10246853bb930b534da93aa"} Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.302626 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.180:8775/\": read tcp 10.217.0.2:36062->10.217.0.180:8775: read: connection reset by peer" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.302732 4763 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.180:8775/\": read tcp 10.217.0.2:36052->10.217.0.180:8775: read: connection reset by peer" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.816219 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.848072 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80db5535-a70c-4d66-9946-18192483b360-logs\") pod \"80db5535-a70c-4d66-9946-18192483b360\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.848141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-nova-metadata-tls-certs\") pod \"80db5535-a70c-4d66-9946-18192483b360\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.848191 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-combined-ca-bundle\") pod \"80db5535-a70c-4d66-9946-18192483b360\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.848247 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qrkt\" (UniqueName: \"kubernetes.io/projected/80db5535-a70c-4d66-9946-18192483b360-kube-api-access-7qrkt\") pod \"80db5535-a70c-4d66-9946-18192483b360\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.848267 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-config-data\") pod \"80db5535-a70c-4d66-9946-18192483b360\" (UID: \"80db5535-a70c-4d66-9946-18192483b360\") " Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.848957 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.848936882 podStartE2EDuration="3.848936882s" podCreationTimestamp="2025-12-01 09:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:36:39.200005305 +0000 UTC m=+1316.468654083" watchObservedRunningTime="2025-12-01 09:36:40.848936882 +0000 UTC m=+1318.117585650" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.852698 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80db5535-a70c-4d66-9946-18192483b360-logs" (OuterVolumeSpecName: "logs") pod "80db5535-a70c-4d66-9946-18192483b360" (UID: "80db5535-a70c-4d66-9946-18192483b360"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.856838 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80db5535-a70c-4d66-9946-18192483b360-kube-api-access-7qrkt" (OuterVolumeSpecName: "kube-api-access-7qrkt") pod "80db5535-a70c-4d66-9946-18192483b360" (UID: "80db5535-a70c-4d66-9946-18192483b360"). InnerVolumeSpecName "kube-api-access-7qrkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.900363 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80db5535-a70c-4d66-9946-18192483b360" (UID: "80db5535-a70c-4d66-9946-18192483b360"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.944673 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-config-data" (OuterVolumeSpecName: "config-data") pod "80db5535-a70c-4d66-9946-18192483b360" (UID: "80db5535-a70c-4d66-9946-18192483b360"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.946235 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "80db5535-a70c-4d66-9946-18192483b360" (UID: "80db5535-a70c-4d66-9946-18192483b360"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.950819 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.950924 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qrkt\" (UniqueName: \"kubernetes.io/projected/80db5535-a70c-4d66-9946-18192483b360-kube-api-access-7qrkt\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.951013 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.951079 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80db5535-a70c-4d66-9946-18192483b360-logs\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.951133 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80db5535-a70c-4d66-9946-18192483b360-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:40 crc kubenswrapper[4763]: E1201 09:36:40.957882 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c is running failed: container process not found" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:36:40 crc kubenswrapper[4763]: E1201 09:36:40.958852 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c is running failed: container process not found" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:36:40 crc kubenswrapper[4763]: E1201 09:36:40.959367 4763 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c is running failed: container process not found" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 01 09:36:40 crc kubenswrapper[4763]: E1201 09:36:40.959481 4763 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fb4ca475-898b-4bb6-a249-8bc297417775" containerName="nova-scheduler-scheduler" Dec 01 09:36:40 crc kubenswrapper[4763]: I1201 09:36:40.989086 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.051787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj2gn\" (UniqueName: \"kubernetes.io/projected/fb4ca475-898b-4bb6-a249-8bc297417775-kube-api-access-kj2gn\") pod \"fb4ca475-898b-4bb6-a249-8bc297417775\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.052159 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-combined-ca-bundle\") pod \"fb4ca475-898b-4bb6-a249-8bc297417775\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.052266 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-config-data\") pod \"fb4ca475-898b-4bb6-a249-8bc297417775\" (UID: \"fb4ca475-898b-4bb6-a249-8bc297417775\") " Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.061363 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb4ca475-898b-4bb6-a249-8bc297417775-kube-api-access-kj2gn" (OuterVolumeSpecName: "kube-api-access-kj2gn") pod "fb4ca475-898b-4bb6-a249-8bc297417775" (UID: "fb4ca475-898b-4bb6-a249-8bc297417775"). InnerVolumeSpecName "kube-api-access-kj2gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.077513 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-config-data" (OuterVolumeSpecName: "config-data") pod "fb4ca475-898b-4bb6-a249-8bc297417775" (UID: "fb4ca475-898b-4bb6-a249-8bc297417775"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.080025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb4ca475-898b-4bb6-a249-8bc297417775" (UID: "fb4ca475-898b-4bb6-a249-8bc297417775"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.153570 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.153598 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb4ca475-898b-4bb6-a249-8bc297417775-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.153608 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj2gn\" (UniqueName: \"kubernetes.io/projected/fb4ca475-898b-4bb6-a249-8bc297417775-kube-api-access-kj2gn\") on node \"crc\" DevicePath \"\"" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.199157 4763 generic.go:334] "Generic (PLEG): container finished" podID="80db5535-a70c-4d66-9946-18192483b360" containerID="6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab" exitCode=0 Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.199226 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.199254 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80db5535-a70c-4d66-9946-18192483b360","Type":"ContainerDied","Data":"6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab"} Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.199709 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80db5535-a70c-4d66-9946-18192483b360","Type":"ContainerDied","Data":"eb6081a6ae3e26484c3485ed94c1150bf415db62506a87a4f8d0e01249f26d01"} Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.199738 4763 scope.go:117] "RemoveContainer" containerID="6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.201342 4763 generic.go:334] "Generic (PLEG): container finished" podID="fb4ca475-898b-4bb6-a249-8bc297417775" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" exitCode=0 Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.201367 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.201374 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb4ca475-898b-4bb6-a249-8bc297417775","Type":"ContainerDied","Data":"8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c"} Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.201395 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb4ca475-898b-4bb6-a249-8bc297417775","Type":"ContainerDied","Data":"8669eb82c395c4eed8064285231f37ab41de513b095853d170c2391fda101096"} Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.237966 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.238556 4763 scope.go:117] "RemoveContainer" containerID="1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.247813 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.259118 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.262211 4763 scope.go:117] "RemoveContainer" containerID="6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab" Dec 01 09:36:41 crc kubenswrapper[4763]: E1201 09:36:41.262756 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab\": container with ID starting with 6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab not found: ID does not exist" containerID="6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.262803 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab"} err="failed to get container status \"6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab\": rpc error: code = NotFound desc = could not find container \"6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab\": container with ID starting with 6b55ad482abe7437dab080435528dc861f186720de4171a96c579494d5e431ab not found: ID does not exist" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.262834 4763 scope.go:117] "RemoveContainer" containerID="1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f" Dec 01 09:36:41 crc kubenswrapper[4763]: E1201 09:36:41.263124 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f\": container with ID starting with 1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f not found: ID does not exist" containerID="1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.263158 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f"} err="failed to get container status \"1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f\": rpc error: code = NotFound desc = could not 
find container \"1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f\": container with ID starting with 1d4bb29e84eccc12bc28b384e6260192a000bc5002aab883273d43244441ff2f not found: ID does not exist" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.263178 4763 scope.go:117] "RemoveContainer" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.266610 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.276906 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:41 crc kubenswrapper[4763]: E1201 09:36:41.277338 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb4ca475-898b-4bb6-a249-8bc297417775" containerName="nova-scheduler-scheduler" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.277359 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4ca475-898b-4bb6-a249-8bc297417775" containerName="nova-scheduler-scheduler" Dec 01 09:36:41 crc kubenswrapper[4763]: E1201 09:36:41.277378 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-log" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.277386 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-log" Dec 01 09:36:41 crc kubenswrapper[4763]: E1201 09:36:41.277418 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-metadata" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.277425 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-metadata" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.277627 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-log" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.277648 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="80db5535-a70c-4d66-9946-18192483b360" containerName="nova-metadata-metadata" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.277659 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb4ca475-898b-4bb6-a249-8bc297417775" containerName="nova-scheduler-scheduler" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.278908 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.286422 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.286947 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.289347 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.295046 4763 scope.go:117] "RemoveContainer" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" Dec 01 09:36:41 crc kubenswrapper[4763]: E1201 09:36:41.295649 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c\": container with ID starting with 8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c not found: ID does not exist" containerID="8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.295688 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c"} err="failed to get container status \"8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c\": rpc error: code = NotFound desc = could not find container \"8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c\": container with ID starting with 8ed954355e7bb65f6cc709af9f858e617ce76981e3872b5079bf9c54a45cc66c not found: ID does not exist" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.305112 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.306342 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.308479 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.311631 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.356809 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.356856 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002374b8-d55c-4996-9fb9-0e4fc758dc7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.356905 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.356934 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsjdf\" (UniqueName: \"kubernetes.io/projected/002374b8-d55c-4996-9fb9-0e4fc758dc7f-kube-api-access-lsjdf\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.356972 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-logs\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.356994 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdft\" (UniqueName: \"kubernetes.io/projected/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-kube-api-access-pvdft\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.357013 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-config-data\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.357051 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002374b8-d55c-4996-9fb9-0e4fc758dc7f-config-data\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458440 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-logs\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458526 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdft\" (UniqueName: \"kubernetes.io/projected/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-kube-api-access-pvdft\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-config-data\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458598 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002374b8-d55c-4996-9fb9-0e4fc758dc7f-config-data\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458665 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458685 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002374b8-d55c-4996-9fb9-0e4fc758dc7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458734 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458761 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsjdf\" (UniqueName: \"kubernetes.io/projected/002374b8-d55c-4996-9fb9-0e4fc758dc7f-kube-api-access-lsjdf\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.458956 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-logs\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.465090 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc 
kubenswrapper[4763]: I1201 09:36:41.466568 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.468950 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-config-data\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.469505 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002374b8-d55c-4996-9fb9-0e4fc758dc7f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.482728 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002374b8-d55c-4996-9fb9-0e4fc758dc7f-config-data\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.483562 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdft\" (UniqueName: \"kubernetes.io/projected/b10d5a09-4c44-4fb9-bcc0-b04612dde39c-kube-api-access-pvdft\") pod \"nova-metadata-0\" (UID: \"b10d5a09-4c44-4fb9-bcc0-b04612dde39c\") " pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.488029 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsjdf\" (UniqueName: \"kubernetes.io/projected/002374b8-d55c-4996-9fb9-0e4fc758dc7f-kube-api-access-lsjdf\") pod \"nova-scheduler-0\" (UID: \"002374b8-d55c-4996-9fb9-0e4fc758dc7f\") " pod="openstack/nova-scheduler-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.596036 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 01 09:36:41 crc kubenswrapper[4763]: I1201 09:36:41.636304 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 01 09:36:42 crc kubenswrapper[4763]: W1201 09:36:42.145152 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb10d5a09_4c44_4fb9_bcc0_b04612dde39c.slice/crio-5c77ffb228d9f9ae6eb8a31203b9309f10f4c3f9f54bcd7fb266e606e7da803f WatchSource:0}: Error finding container 5c77ffb228d9f9ae6eb8a31203b9309f10f4c3f9f54bcd7fb266e606e7da803f: Status 404 returned error can't find the container with id 5c77ffb228d9f9ae6eb8a31203b9309f10f4c3f9f54bcd7fb266e606e7da803f Dec 01 09:36:42 crc kubenswrapper[4763]: I1201 09:36:42.145784 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 01 09:36:42 crc kubenswrapper[4763]: I1201 09:36:42.216478 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b10d5a09-4c44-4fb9-bcc0-b04612dde39c","Type":"ContainerStarted","Data":"5c77ffb228d9f9ae6eb8a31203b9309f10f4c3f9f54bcd7fb266e606e7da803f"} Dec 01 09:36:42 crc kubenswrapper[4763]: W1201 09:36:42.247997 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod002374b8_d55c_4996_9fb9_0e4fc758dc7f.slice/crio-19ea8510d890f9e2a29b5b2dcfd4c14b3f01e222b54267bb52c8a1cf0bd7674a WatchSource:0}: Error finding container 19ea8510d890f9e2a29b5b2dcfd4c14b3f01e222b54267bb52c8a1cf0bd7674a: Status 404 returned error can't find the container with id 19ea8510d890f9e2a29b5b2dcfd4c14b3f01e222b54267bb52c8a1cf0bd7674a Dec 01 09:36:42 crc kubenswrapper[4763]: I1201 09:36:42.261331 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 01 09:36:43 crc kubenswrapper[4763]: I1201 09:36:43.017151 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80db5535-a70c-4d66-9946-18192483b360" path="/var/lib/kubelet/pods/80db5535-a70c-4d66-9946-18192483b360/volumes" Dec 01 09:36:43 crc kubenswrapper[4763]: I1201 09:36:43.018600 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb4ca475-898b-4bb6-a249-8bc297417775" path="/var/lib/kubelet/pods/fb4ca475-898b-4bb6-a249-8bc297417775/volumes" Dec 01 09:36:43 crc kubenswrapper[4763]: I1201 09:36:43.230731 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b10d5a09-4c44-4fb9-bcc0-b04612dde39c","Type":"ContainerStarted","Data":"829810f75eebd1fe668bd060289c44e9bfb6f0bc6dab4c482237504cea33e040"} Dec 01 09:36:43 crc kubenswrapper[4763]: I1201 09:36:43.231009 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b10d5a09-4c44-4fb9-bcc0-b04612dde39c","Type":"ContainerStarted","Data":"38904b89e0f540ae105c18996d1627a42bba34520b8d97156ea7ed3aa80b6254"} Dec 01 09:36:43 crc kubenswrapper[4763]: I1201 09:36:43.232736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"002374b8-d55c-4996-9fb9-0e4fc758dc7f","Type":"ContainerStarted","Data":"83cafa9498291b3e44e36e03aa7d8606e95fc8384db3d591825e28e253752bf3"} Dec 01 09:36:43 crc kubenswrapper[4763]: I1201 09:36:43.232825 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"002374b8-d55c-4996-9fb9-0e4fc758dc7f","Type":"ContainerStarted","Data":"19ea8510d890f9e2a29b5b2dcfd4c14b3f01e222b54267bb52c8a1cf0bd7674a"} Dec 01 09:36:43 crc kubenswrapper[4763]: I1201 09:36:43.256580 4763 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.256558334 podStartE2EDuration="2.256558334s" podCreationTimestamp="2025-12-01 09:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:36:43.247285219 +0000 UTC m=+1320.515933987" watchObservedRunningTime="2025-12-01 09:36:43.256558334 +0000 UTC m=+1320.525207122" Dec 01 09:36:43 crc kubenswrapper[4763]: I1201 09:36:43.271708 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.271689167 podStartE2EDuration="2.271689167s" podCreationTimestamp="2025-12-01 09:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:36:43.260410301 +0000 UTC m=+1320.529059069" watchObservedRunningTime="2025-12-01 09:36:43.271689167 +0000 UTC m=+1320.540337945" Dec 01 09:36:46 crc kubenswrapper[4763]: I1201 09:36:46.596117 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:36:46 crc kubenswrapper[4763]: I1201 09:36:46.596628 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 01 09:36:46 crc kubenswrapper[4763]: I1201 09:36:46.636888 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 01 09:36:47 crc kubenswrapper[4763]: I1201 09:36:47.531310 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:36:47 crc kubenswrapper[4763]: I1201 09:36:47.531643 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 01 09:36:48 crc kubenswrapper[4763]: I1201 09:36:48.545661 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5c652d84-294e-4f79-bbcd-37fca6657cd6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:36:48 crc kubenswrapper[4763]: I1201 09:36:48.545619 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5c652d84-294e-4f79-bbcd-37fca6657cd6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 01 09:36:51 crc kubenswrapper[4763]: I1201 09:36:51.596203 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 09:36:51 crc kubenswrapper[4763]: I1201 09:36:51.596559 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 01 09:36:51 crc kubenswrapper[4763]: I1201 09:36:51.637067 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 01 09:36:51 crc kubenswrapper[4763]: I1201 09:36:51.666613 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 01 09:36:52 crc kubenswrapper[4763]: I1201 09:36:52.340717 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 01 09:36:52 crc kubenswrapper[4763]: I1201 09:36:52.607730 4763 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="b10d5a09-4c44-4fb9-bcc0-b04612dde39c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:36:52 crc kubenswrapper[4763]: I1201 09:36:52.608075 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b10d5a09-4c44-4fb9-bcc0-b04612dde39c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 01 09:36:53 crc kubenswrapper[4763]: I1201 09:36:53.447555 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 09:36:57 crc kubenswrapper[4763]: I1201 09:36:57.538151 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 09:36:57 crc kubenswrapper[4763]: I1201 09:36:57.538702 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 01 09:36:57 crc kubenswrapper[4763]: I1201 09:36:57.539408 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 09:36:57 crc kubenswrapper[4763]: I1201 09:36:57.539429 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 01 09:36:57 crc kubenswrapper[4763]: I1201 09:36:57.544174 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 09:36:57 crc kubenswrapper[4763]: I1201 09:36:57.545198 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 01 09:37:01 crc kubenswrapper[4763]: I1201 09:37:01.603398 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 09:37:01 crc kubenswrapper[4763]: I1201 09:37:01.604255 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 01 09:37:01 crc kubenswrapper[4763]: I1201 09:37:01.614349 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 09:37:02 crc kubenswrapper[4763]: I1201 09:37:02.432734 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 01 09:37:11 crc kubenswrapper[4763]: I1201 09:37:11.079024 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:37:12 crc kubenswrapper[4763]: I1201 09:37:12.022295 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:37:16 crc kubenswrapper[4763]: I1201 09:37:16.176059 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="53cf9c04-a52d-4827-a700-98ca02183344" containerName="rabbitmq" containerID="cri-o://6421caf3cc59da633677f6a1a23a7dff3b63e33d2cf6cf34af4e51c833217b8c" gracePeriod=604796 Dec 01 09:37:16 crc kubenswrapper[4763]: I1201 09:37:16.361115 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="56f133f4-8bf0-4c02-add2-37f41b8904cc" containerName="rabbitmq" containerID="cri-o://87effba6752f4348dba8ec59ec9227f854967d19a27416c23ce446ec22ce32cc" gracePeriod=604795 Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.615435 4763 
generic.go:334] "Generic (PLEG): container finished" podID="56f133f4-8bf0-4c02-add2-37f41b8904cc" containerID="87effba6752f4348dba8ec59ec9227f854967d19a27416c23ce446ec22ce32cc" exitCode=0 Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.615789 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56f133f4-8bf0-4c02-add2-37f41b8904cc","Type":"ContainerDied","Data":"87effba6752f4348dba8ec59ec9227f854967d19a27416c23ce446ec22ce32cc"} Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.618666 4763 generic.go:334] "Generic (PLEG): container finished" podID="53cf9c04-a52d-4827-a700-98ca02183344" containerID="6421caf3cc59da633677f6a1a23a7dff3b63e33d2cf6cf34af4e51c833217b8c" exitCode=0 Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.618693 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53cf9c04-a52d-4827-a700-98ca02183344","Type":"ContainerDied","Data":"6421caf3cc59da633677f6a1a23a7dff3b63e33d2cf6cf34af4e51c833217b8c"} Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.758295 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844022 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-tls\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844084 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-config-data\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844113 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxwvk\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-kube-api-access-lxwvk\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844183 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844212 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53cf9c04-a52d-4827-a700-98ca02183344-pod-info\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844240 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-confd\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844575 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-server-conf\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844653 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-plugins\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844743 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-plugins-conf\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844819 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-erlang-cookie\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.844846 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53cf9c04-a52d-4827-a700-98ca02183344-erlang-cookie-secret\") pod \"53cf9c04-a52d-4827-a700-98ca02183344\" (UID: \"53cf9c04-a52d-4827-a700-98ca02183344\") " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.845429 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.845753 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.845757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.846260 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.854792 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-kube-api-access-lxwvk" (OuterVolumeSpecName: "kube-api-access-lxwvk") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "kube-api-access-lxwvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.868595 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/53cf9c04-a52d-4827-a700-98ca02183344-pod-info" (OuterVolumeSpecName: "pod-info") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.872037 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.878194 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.883743 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53cf9c04-a52d-4827-a700-98ca02183344-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.911658 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-config-data" (OuterVolumeSpecName: "config-data") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.944751 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-server-conf" (OuterVolumeSpecName: "server-conf") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947037 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947076 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947089 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxwvk\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-kube-api-access-lxwvk\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947102 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53cf9c04-a52d-4827-a700-98ca02183344-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947132 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947141 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53cf9c04-a52d-4827-a700-98ca02183344-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947149 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947159 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.947168 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53cf9c04-a52d-4827-a700-98ca02183344-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:22 crc kubenswrapper[4763]: I1201 09:37:22.953919 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.002529 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.037260 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "53cf9c04-a52d-4827-a700-98ca02183344" (UID: "53cf9c04-a52d-4827-a700-98ca02183344"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050106 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56f133f4-8bf0-4c02-add2-37f41b8904cc-pod-info\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050179 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-plugins\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050242 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4rpg\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-kube-api-access-c4rpg\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050280 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56f133f4-8bf0-4c02-add2-37f41b8904cc-erlang-cookie-secret\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050333 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-erlang-cookie\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050401 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-server-conf\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050449 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-tls\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050490 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-confd\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050561 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050611 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-plugins-conf\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: 
\"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.050648 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-config-data\") pod \"56f133f4-8bf0-4c02-add2-37f41b8904cc\" (UID: \"56f133f4-8bf0-4c02-add2-37f41b8904cc\") " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.051093 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.051109 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53cf9c04-a52d-4827-a700-98ca02183344-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.052975 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.054372 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/56f133f4-8bf0-4c02-add2-37f41b8904cc-pod-info" (OuterVolumeSpecName: "pod-info") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.054980 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.056989 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.067418 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-kube-api-access-c4rpg" (OuterVolumeSpecName: "kube-api-access-c4rpg") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "kube-api-access-c4rpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.076082 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.078094 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f133f4-8bf0-4c02-add2-37f41b8904cc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.081763 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.120268 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-config-data" (OuterVolumeSpecName: "config-data") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.152902 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.152940 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.152978 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.152991 4763 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.153003 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.153013 4763 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/56f133f4-8bf0-4c02-add2-37f41b8904cc-pod-info\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.153022 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.153033 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4rpg\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-kube-api-access-c4rpg\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc 
kubenswrapper[4763]: I1201 09:37:23.153044 4763 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/56f133f4-8bf0-4c02-add2-37f41b8904cc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.181939 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.188982 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-server-conf" (OuterVolumeSpecName: "server-conf") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.231299 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "56f133f4-8bf0-4c02-add2-37f41b8904cc" (UID: "56f133f4-8bf0-4c02-add2-37f41b8904cc"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.255008 4763 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/56f133f4-8bf0-4c02-add2-37f41b8904cc-server-conf\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.255045 4763 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/56f133f4-8bf0-4c02-add2-37f41b8904cc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.255057 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.638415 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.638422 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"56f133f4-8bf0-4c02-add2-37f41b8904cc","Type":"ContainerDied","Data":"43b26529fb8204e56a85168bbb47bbbf967a593161cdbda0b7f12effb921e118"} Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.638501 4763 scope.go:117] "RemoveContainer" containerID="87effba6752f4348dba8ec59ec9227f854967d19a27416c23ce446ec22ce32cc" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.646892 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"53cf9c04-a52d-4827-a700-98ca02183344","Type":"ContainerDied","Data":"f0e119b7582423098a7ffffb93f21127f86fdb3558bafdd852696d092acedfa6"} Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.646980 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.687796 4763 scope.go:117] "RemoveContainer" containerID="86983aba13148484a71a0d2d268e9d207c4ea276886647d390e4527e620f1a60" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.740226 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.743505 4763 scope.go:117] "RemoveContainer" containerID="6421caf3cc59da633677f6a1a23a7dff3b63e33d2cf6cf34af4e51c833217b8c" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.761224 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.812021 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:37:23 crc kubenswrapper[4763]: E1201 09:37:23.812428 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f133f4-8bf0-4c02-add2-37f41b8904cc" containerName="rabbitmq" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.812441 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f133f4-8bf0-4c02-add2-37f41b8904cc" containerName="rabbitmq" Dec 01 09:37:23 crc kubenswrapper[4763]: E1201 09:37:23.812484 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f133f4-8bf0-4c02-add2-37f41b8904cc" containerName="setup-container" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.812490 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f133f4-8bf0-4c02-add2-37f41b8904cc" containerName="setup-container" Dec 01 09:37:23 crc kubenswrapper[4763]: E1201 09:37:23.812501 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cf9c04-a52d-4827-a700-98ca02183344" containerName="setup-container" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.812507 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cf9c04-a52d-4827-a700-98ca02183344" containerName="setup-container" Dec 01 09:37:23 crc kubenswrapper[4763]: E1201 09:37:23.812513 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cf9c04-a52d-4827-a700-98ca02183344" containerName="rabbitmq" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.812519 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cf9c04-a52d-4827-a700-98ca02183344" containerName="rabbitmq" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.812696 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f133f4-8bf0-4c02-add2-37f41b8904cc" containerName="rabbitmq" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.812720 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cf9c04-a52d-4827-a700-98ca02183344" containerName="rabbitmq" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.813754 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.815600 4763 scope.go:117] "RemoveContainer" containerID="8c1d98881c3bc1622990c364f450de88e45d211ceb7dc05c3517a65a63a82b89" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.817601 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.817884 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.819792 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.820149 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.820393 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ncncw" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.820568 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.825969 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.829010 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.835048 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.860546 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.872621 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.874280 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.876324 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.880638 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.881688 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5tks6" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.881803 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.881901 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.883225 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.883430 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.883699 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.885304 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.885443 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6051a720-a09e-4c11-a9c4-465be3730f65-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.885609 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.885835 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.885965 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.886090 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.886537 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz6tc\" (UniqueName: \"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-kube-api-access-tz6tc\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.886725 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-config-data\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.886961 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.887135 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6051a720-a09e-4c11-a9c4-465be3730f65-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.887724 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.991838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-config-data\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.990611 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-config-data\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.991948 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.991995 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992016 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992130 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992161 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkxwv\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-kube-api-access-xkxwv\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992255 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6051a720-a09e-4c11-a9c4-465be3730f65-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992352 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992387 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992409 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992445 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992493 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6051a720-a09e-4c11-a9c4-465be3730f65-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc 
kubenswrapper[4763]: I1201 09:37:23.992525 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992583 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992601 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992622 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992646 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992664 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992681 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992696 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992715 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6tc\" (UniqueName: \"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-kube-api-access-tz6tc\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.992735 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.993166 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.993694 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.994304 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.996275 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6051a720-a09e-4c11-a9c4-465be3730f65-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.997849 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:23 crc kubenswrapper[4763]: I1201 09:37:23.997986 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6051a720-a09e-4c11-a9c4-465be3730f65-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.000031 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6051a720-a09e-4c11-a9c4-465be3730f65-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.001691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.014306 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 
09:37:24.015970 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz6tc\" (UniqueName: \"kubernetes.io/projected/6051a720-a09e-4c11-a9c4-465be3730f65-kube-api-access-tz6tc\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.028165 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6051a720-a09e-4c11-a9c4-465be3730f65\") " pod="openstack/rabbitmq-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094250 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094615 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094637 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094658 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkxwv\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-kube-api-access-xkxwv\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094682 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094706 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094780 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094803 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094833 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094854 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.094881 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.095512 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.095900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.096430 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.096559 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.096743 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.097579 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.099353 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.100167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.101714 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.110075 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.116799 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkxwv\" (UniqueName: \"kubernetes.io/projected/6d10e5ae-f63a-4bdf-b3f5-2f99e6856799-kube-api-access-xkxwv\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.129787 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799\") " pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.156317 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.211049 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.622108 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 01 09:37:24 crc kubenswrapper[4763]: W1201 09:37:24.633542 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6051a720_a09e_4c11_a9c4_465be3730f65.slice/crio-c73731a07cb26f403768d72aa28c7c1b486f8feb422e7eb89bc6c2a551bd72d0 WatchSource:0}: Error finding container c73731a07cb26f403768d72aa28c7c1b486f8feb422e7eb89bc6c2a551bd72d0: Status 404 returned error can't find the container with id c73731a07cb26f403768d72aa28c7c1b486f8feb422e7eb89bc6c2a551bd72d0 Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.666872 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6051a720-a09e-4c11-a9c4-465be3730f65","Type":"ContainerStarted","Data":"c73731a07cb26f403768d72aa28c7c1b486f8feb422e7eb89bc6c2a551bd72d0"} Dec 01 09:37:24 crc kubenswrapper[4763]: I1201 09:37:24.741784 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 01 09:37:24 crc kubenswrapper[4763]: W1201 09:37:24.749975 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d10e5ae_f63a_4bdf_b3f5_2f99e6856799.slice/crio-172f16f1286a7be2a575c394340ca0684c2097fad7bf592d7f8a235312f58834 WatchSource:0}: Error finding container 172f16f1286a7be2a575c394340ca0684c2097fad7bf592d7f8a235312f58834: Status 404 returned error can't find the container with id 172f16f1286a7be2a575c394340ca0684c2097fad7bf592d7f8a235312f58834 Dec 01 09:37:25 crc kubenswrapper[4763]: I1201 09:37:25.006365 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cf9c04-a52d-4827-a700-98ca02183344" path="/var/lib/kubelet/pods/53cf9c04-a52d-4827-a700-98ca02183344/volumes" Dec 01 09:37:25 crc kubenswrapper[4763]: I1201 09:37:25.007928 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f133f4-8bf0-4c02-add2-37f41b8904cc" path="/var/lib/kubelet/pods/56f133f4-8bf0-4c02-add2-37f41b8904cc/volumes" Dec 01 09:37:25 crc kubenswrapper[4763]: I1201 09:37:25.678049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799","Type":"ContainerStarted","Data":"172f16f1286a7be2a575c394340ca0684c2097fad7bf592d7f8a235312f58834"} Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.701965 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-5x72p"] Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.703718 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.708665 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.714894 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799","Type":"ContainerStarted","Data":"7bcb96e0f65f72c9fa4bc1e9a794642dea1f36dc1f51e3df7260d3a6a393f27d"} Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.729698 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6051a720-a09e-4c11-a9c4-465be3730f65","Type":"ContainerStarted","Data":"fa1c6f65ae42eec43a106d1d83e65265137b61fb65a660b38dc27c6bb32f6d80"} Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.732253 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-5x72p"] Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.745227 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.745279 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq6wt\" (UniqueName: \"kubernetes.io/projected/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-kube-api-access-xq6wt\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.745302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.745348 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.745433 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-config\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.745497 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.847386 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.847542 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.847570 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq6wt\" (UniqueName: \"kubernetes.io/projected/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-kube-api-access-xq6wt\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.847600 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.847650 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.847725 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-config\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.848835 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-dns-svc\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.849070 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.849527 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.849719 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-config\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.849898 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:26 crc kubenswrapper[4763]: I1201 09:37:26.872251 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq6wt\" (UniqueName: \"kubernetes.io/projected/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-kube-api-access-xq6wt\") pod \"dnsmasq-dns-578b8d767c-5x72p\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:27 crc kubenswrapper[4763]: I1201 09:37:27.026440 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:27 crc kubenswrapper[4763]: I1201 09:37:27.480886 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-5x72p"] Dec 01 09:37:27 crc kubenswrapper[4763]: I1201 09:37:27.740098 4763 generic.go:334] "Generic (PLEG): container finished" podID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" containerID="8bbfae72a3f67d9bd8166301e475ba4f3c723d2fbc74ecd2757b42dea1339f4a" exitCode=0 Dec 01 09:37:27 crc kubenswrapper[4763]: I1201 09:37:27.740185 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" event={"ID":"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8","Type":"ContainerDied","Data":"8bbfae72a3f67d9bd8166301e475ba4f3c723d2fbc74ecd2757b42dea1339f4a"} Dec 01 09:37:27 crc kubenswrapper[4763]: I1201 09:37:27.741438 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" event={"ID":"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8","Type":"ContainerStarted","Data":"01bb8aa599e0293722d56b759f8b6f3e192a7e8a933538ede188a21b328a9efe"} Dec 01 09:37:28 crc kubenswrapper[4763]: I1201 09:37:28.759370 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" event={"ID":"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8","Type":"ContainerStarted","Data":"ec393e866d59a6eaa5d62403cc35e5d5226aab9cc5202f1b283c0737ae5e22f3"} Dec 01 09:37:28 crc kubenswrapper[4763]: I1201 09:37:28.760051 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:28 crc kubenswrapper[4763]: I1201 09:37:28.793639 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" podStartSLOduration=2.793617919 podStartE2EDuration="2.793617919s" podCreationTimestamp="2025-12-01 09:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:37:28.784810676 +0000 UTC m=+1366.053459444" watchObservedRunningTime="2025-12-01 09:37:28.793617919 +0000 UTC m=+1366.062266687" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.028655 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.099270 4763 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-752rw"] Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.099607 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" podUID="2645413f-063f-475a-860d-1876987608fc" containerName="dnsmasq-dns" containerID="cri-o://e05f333bcc113aef0126f257b2ca7bd48d1d5c57a7d914764929d1032c3b8f1a" gracePeriod=10 Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.244431 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-nksxv"] Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.248081 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.272143 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-nksxv"] Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.345564 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-config\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.345631 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-openstack-edpm-ipam\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.345655 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tfbm\" (UniqueName: \"kubernetes.io/projected/a50257f6-6461-4ee3-b40b-4f56fe98dfad-kube-api-access-2tfbm\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.345849 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-nb\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.345909 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-sb\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.345986 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-dns-svc\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.448214 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-openstack-edpm-ipam\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.448267 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tfbm\" (UniqueName: \"kubernetes.io/projected/a50257f6-6461-4ee3-b40b-4f56fe98dfad-kube-api-access-2tfbm\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.448321 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-nb\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.448342 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-sb\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.448377 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-dns-svc\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.448465 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-config\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.449854 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-nb\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.449933 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-openstack-edpm-ipam\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.449961 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-dns-svc\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.450080 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-config\") pod 
\"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.450513 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-sb\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.476475 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tfbm\" (UniqueName: \"kubernetes.io/projected/a50257f6-6461-4ee3-b40b-4f56fe98dfad-kube-api-access-2tfbm\") pod \"dnsmasq-dns-667ff9c869-nksxv\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.592588 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.840824 4763 generic.go:334] "Generic (PLEG): container finished" podID="2645413f-063f-475a-860d-1876987608fc" containerID="e05f333bcc113aef0126f257b2ca7bd48d1d5c57a7d914764929d1032c3b8f1a" exitCode=0 Dec 01 09:37:32 crc kubenswrapper[4763]: I1201 09:37:32.841170 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" event={"ID":"2645413f-063f-475a-860d-1876987608fc","Type":"ContainerDied","Data":"e05f333bcc113aef0126f257b2ca7bd48d1d5c57a7d914764929d1032c3b8f1a"} Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.079948 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-nksxv"] Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.128144 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.266258 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-sb\") pod \"2645413f-063f-475a-860d-1876987608fc\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.266311 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6d5q\" (UniqueName: \"kubernetes.io/projected/2645413f-063f-475a-860d-1876987608fc-kube-api-access-z6d5q\") pod \"2645413f-063f-475a-860d-1876987608fc\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.266385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-config\") pod \"2645413f-063f-475a-860d-1876987608fc\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.266576 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-dns-svc\") pod \"2645413f-063f-475a-860d-1876987608fc\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.266599 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-nb\") pod \"2645413f-063f-475a-860d-1876987608fc\" (UID: \"2645413f-063f-475a-860d-1876987608fc\") " Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.276584 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2645413f-063f-475a-860d-1876987608fc-kube-api-access-z6d5q" (OuterVolumeSpecName: "kube-api-access-z6d5q") pod "2645413f-063f-475a-860d-1876987608fc" (UID: "2645413f-063f-475a-860d-1876987608fc"). InnerVolumeSpecName "kube-api-access-z6d5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.317954 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2645413f-063f-475a-860d-1876987608fc" (UID: "2645413f-063f-475a-860d-1876987608fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.325601 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2645413f-063f-475a-860d-1876987608fc" (UID: "2645413f-063f-475a-860d-1876987608fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.334089 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2645413f-063f-475a-860d-1876987608fc" (UID: "2645413f-063f-475a-860d-1876987608fc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.348822 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-config" (OuterVolumeSpecName: "config") pod "2645413f-063f-475a-860d-1876987608fc" (UID: "2645413f-063f-475a-860d-1876987608fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.373808 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.373841 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.373851 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.373860 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6d5q\" (UniqueName: \"kubernetes.io/projected/2645413f-063f-475a-860d-1876987608fc-kube-api-access-z6d5q\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.373869 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2645413f-063f-475a-860d-1876987608fc-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.852516 4763 generic.go:334] "Generic (PLEG): container finished" podID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" containerID="d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad" exitCode=0 Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.852556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" event={"ID":"a50257f6-6461-4ee3-b40b-4f56fe98dfad","Type":"ContainerDied","Data":"d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad"} Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.852594 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" event={"ID":"a50257f6-6461-4ee3-b40b-4f56fe98dfad","Type":"ContainerStarted","Data":"74ea43238c9e8e7b43cb4cfc8129870192e0570c1801205226fa6b6f14f41d28"} Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.856871 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" event={"ID":"2645413f-063f-475a-860d-1876987608fc","Type":"ContainerDied","Data":"80abce437aed7b3418d940f2bc405af5804e6de83390fd275b37984066b06738"} Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.856926 4763 scope.go:117] "RemoveContainer" containerID="e05f333bcc113aef0126f257b2ca7bd48d1d5c57a7d914764929d1032c3b8f1a" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.856963 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-752rw" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.932738 4763 scope.go:117] "RemoveContainer" containerID="56c775aeb8e9ca3117c6d30148a8320316881e3d9174537a3d54c05d55b758b2" Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.944487 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-752rw"] Dec 01 09:37:33 crc kubenswrapper[4763]: I1201 09:37:33.952259 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-752rw"] Dec 01 09:37:34 crc kubenswrapper[4763]: I1201 09:37:34.865838 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" event={"ID":"a50257f6-6461-4ee3-b40b-4f56fe98dfad","Type":"ContainerStarted","Data":"d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267"} Dec 01 09:37:34 crc kubenswrapper[4763]: I1201 09:37:34.866189 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:35 crc kubenswrapper[4763]: I1201 09:37:35.004877 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2645413f-063f-475a-860d-1876987608fc" path="/var/lib/kubelet/pods/2645413f-063f-475a-860d-1876987608fc/volumes" Dec 01 09:37:42 crc kubenswrapper[4763]: I1201 09:37:42.594660 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 09:37:42 crc kubenswrapper[4763]: I1201 09:37:42.626654 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" podStartSLOduration=10.626632789 podStartE2EDuration="10.626632789s" podCreationTimestamp="2025-12-01 09:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:37:34.900677701 +0000 UTC m=+1372.169326479" watchObservedRunningTime="2025-12-01 09:37:42.626632789 +0000 UTC m=+1379.895281567" Dec 01 09:37:42 crc kubenswrapper[4763]: I1201 09:37:42.668511 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-5x72p"] Dec 01 09:37:42 crc kubenswrapper[4763]: I1201 09:37:42.668853 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" podUID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" containerName="dnsmasq-dns" containerID="cri-o://ec393e866d59a6eaa5d62403cc35e5d5226aab9cc5202f1b283c0737ae5e22f3" gracePeriod=10 Dec 01 09:37:42 crc kubenswrapper[4763]: I1201 09:37:42.975068 4763 generic.go:334] "Generic (PLEG): container finished" podID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" containerID="ec393e866d59a6eaa5d62403cc35e5d5226aab9cc5202f1b283c0737ae5e22f3" exitCode=0 Dec 01 09:37:42 crc kubenswrapper[4763]: I1201 09:37:42.975669 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" event={"ID":"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8","Type":"ContainerDied","Data":"ec393e866d59a6eaa5d62403cc35e5d5226aab9cc5202f1b283c0737ae5e22f3"} Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.680490 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.775475 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-openstack-edpm-ipam\") pod \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.775583 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-config\") pod \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.775609 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-dns-svc\") pod \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.775682 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq6wt\" (UniqueName: \"kubernetes.io/projected/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-kube-api-access-xq6wt\") pod \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.775794 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-sb\") pod \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.776003 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-nb\") pod \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\" (UID: \"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8\") " Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.799335 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-kube-api-access-xq6wt" (OuterVolumeSpecName: "kube-api-access-xq6wt") pod "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" (UID: "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8"). InnerVolumeSpecName "kube-api-access-xq6wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.850671 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-config" (OuterVolumeSpecName: "config") pod "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" (UID: "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.860414 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" (UID: "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.861219 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" (UID: "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.869283 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" (UID: "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.879130 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.879161 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-config\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.879171 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.879181 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq6wt\" (UniqueName: \"kubernetes.io/projected/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-kube-api-access-xq6wt\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.879214 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.886281 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" (UID: "7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.983839 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.985854 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" event={"ID":"7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8","Type":"ContainerDied","Data":"01bb8aa599e0293722d56b759f8b6f3e192a7e8a933538ede188a21b328a9efe"} Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.985902 4763 scope.go:117] "RemoveContainer" containerID="ec393e866d59a6eaa5d62403cc35e5d5226aab9cc5202f1b283c0737ae5e22f3" Dec 01 09:37:43 crc kubenswrapper[4763]: I1201 09:37:43.986007 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-5x72p" Dec 01 09:37:44 crc kubenswrapper[4763]: I1201 09:37:44.007799 4763 scope.go:117] "RemoveContainer" containerID="8bbfae72a3f67d9bd8166301e475ba4f3c723d2fbc74ecd2757b42dea1339f4a" Dec 01 09:37:44 crc kubenswrapper[4763]: I1201 09:37:44.023729 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-5x72p"] Dec 01 09:37:44 crc kubenswrapper[4763]: I1201 09:37:44.032776 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-5x72p"] Dec 01 09:37:45 crc kubenswrapper[4763]: I1201 09:37:45.013610 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" path="/var/lib/kubelet/pods/7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8/volumes" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.463033 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d"] Dec 01 09:37:48 crc kubenswrapper[4763]: E1201 09:37:48.464071 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2645413f-063f-475a-860d-1876987608fc" containerName="init" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.464091 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2645413f-063f-475a-860d-1876987608fc" containerName="init" Dec 01 09:37:48 crc kubenswrapper[4763]: E1201 09:37:48.464111 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" containerName="init" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.464119 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" containerName="init" Dec 01 09:37:48 crc kubenswrapper[4763]: E1201 09:37:48.464131 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" containerName="dnsmasq-dns" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.464141 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" containerName="dnsmasq-dns" Dec 01 09:37:48 crc kubenswrapper[4763]: E1201 09:37:48.464169 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2645413f-063f-475a-860d-1876987608fc" containerName="dnsmasq-dns" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.464176 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2645413f-063f-475a-860d-1876987608fc" containerName="dnsmasq-dns" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.464390 4763 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2645413f-063f-475a-860d-1876987608fc" containerName="dnsmasq-dns" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.464410 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e40dea6-5afa-4e2a-a6b3-fc8b81f792c8" containerName="dnsmasq-dns" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.465151 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.469055 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.469756 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.469892 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.470105 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.476685 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d"] Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.561224 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.561334 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2jhx\" (UniqueName: \"kubernetes.io/projected/16706493-814b-4cc5-821d-f484a2059376-kube-api-access-q2jhx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.561403 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.561487 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.663636 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.663764 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.663873 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2jhx\" (UniqueName: \"kubernetes.io/projected/16706493-814b-4cc5-821d-f484a2059376-kube-api-access-q2jhx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.663942 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.672568 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.672650 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.682228 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.688722 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2jhx\" (UniqueName: \"kubernetes.io/projected/16706493-814b-4cc5-821d-f484a2059376-kube-api-access-q2jhx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:48 crc kubenswrapper[4763]: I1201 09:37:48.788107 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:37:49 crc kubenswrapper[4763]: I1201 09:37:49.337439 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d"] Dec 01 09:37:49 crc kubenswrapper[4763]: W1201 09:37:49.344700 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16706493_814b_4cc5_821d_f484a2059376.slice/crio-9235e85461e1068e53b6d1c5f5c42272e1afffee50203760e7f5f477e90dc3bc WatchSource:0}: Error finding container 9235e85461e1068e53b6d1c5f5c42272e1afffee50203760e7f5f477e90dc3bc: Status 404 returned error can't find the container with id 9235e85461e1068e53b6d1c5f5c42272e1afffee50203760e7f5f477e90dc3bc Dec 01 09:37:50 crc kubenswrapper[4763]: I1201 09:37:50.042557 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" event={"ID":"16706493-814b-4cc5-821d-f484a2059376","Type":"ContainerStarted","Data":"9235e85461e1068e53b6d1c5f5c42272e1afffee50203760e7f5f477e90dc3bc"} Dec 01 09:37:59 crc kubenswrapper[4763]: I1201 09:37:59.178571 4763 generic.go:334] "Generic (PLEG): container finished" podID="6d10e5ae-f63a-4bdf-b3f5-2f99e6856799" containerID="7bcb96e0f65f72c9fa4bc1e9a794642dea1f36dc1f51e3df7260d3a6a393f27d" exitCode=0 Dec 01 09:37:59 crc kubenswrapper[4763]: I1201 09:37:59.178765 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799","Type":"ContainerDied","Data":"7bcb96e0f65f72c9fa4bc1e9a794642dea1f36dc1f51e3df7260d3a6a393f27d"} Dec 01 09:37:59 crc kubenswrapper[4763]: I1201 09:37:59.183250 4763 generic.go:334] "Generic (PLEG): container finished" podID="6051a720-a09e-4c11-a9c4-465be3730f65" containerID="fa1c6f65ae42eec43a106d1d83e65265137b61fb65a660b38dc27c6bb32f6d80" exitCode=0 Dec 01 09:37:59 crc kubenswrapper[4763]: I1201 09:37:59.183304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6051a720-a09e-4c11-a9c4-465be3730f65","Type":"ContainerDied","Data":"fa1c6f65ae42eec43a106d1d83e65265137b61fb65a660b38dc27c6bb32f6d80"} Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.241267 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d10e5ae-f63a-4bdf-b3f5-2f99e6856799","Type":"ContainerStarted","Data":"f79d75c3257b066169d169ac0c483b04f653f861896cb5b0c988a499e39f4ebc"} Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.242134 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.247532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6051a720-a09e-4c11-a9c4-465be3730f65","Type":"ContainerStarted","Data":"d50f2b77e9a50deeb84c9b6426a0d79bf2371bb1f929a9a03a69096f5d12ac6e"} Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.247939 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.249713 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" 
event={"ID":"16706493-814b-4cc5-821d-f484a2059376","Type":"ContainerStarted","Data":"c5a7cebdabb6c4c6fab4657d54f8f5c667aa6223750edba127017d2341db364c"} Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.285716 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.285690959 podStartE2EDuration="40.285690959s" podCreationTimestamp="2025-12-01 09:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:38:03.278875005 +0000 UTC m=+1400.547523773" watchObservedRunningTime="2025-12-01 09:38:03.285690959 +0000 UTC m=+1400.554339727" Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.314355 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.314335545 podStartE2EDuration="40.314335545s" podCreationTimestamp="2025-12-01 09:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 09:38:03.311077272 +0000 UTC m=+1400.579726040" watchObservedRunningTime="2025-12-01 09:38:03.314335545 +0000 UTC m=+1400.582984313" Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.334713 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" podStartSLOduration=2.535251169 podStartE2EDuration="15.334694981s" podCreationTimestamp="2025-12-01 09:37:48 +0000 UTC" firstStartedPulling="2025-12-01 09:37:49.345992313 +0000 UTC m=+1386.614641081" lastFinishedPulling="2025-12-01 09:38:02.145436135 +0000 UTC m=+1399.414084893" observedRunningTime="2025-12-01 09:38:03.329290654 +0000 UTC m=+1400.597939422" watchObservedRunningTime="2025-12-01 09:38:03.334694981 +0000 UTC m=+1400.603343749" Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.929329 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:38:03 crc kubenswrapper[4763]: I1201 09:38:03.929795 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:38:14 crc kubenswrapper[4763]: I1201 09:38:14.161751 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 01 09:38:14 crc kubenswrapper[4763]: I1201 09:38:14.214859 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 01 09:38:14 crc kubenswrapper[4763]: I1201 09:38:14.376970 4763 generic.go:334] "Generic (PLEG): container finished" podID="16706493-814b-4cc5-821d-f484a2059376" containerID="c5a7cebdabb6c4c6fab4657d54f8f5c667aa6223750edba127017d2341db364c" exitCode=0 Dec 01 09:38:14 crc kubenswrapper[4763]: I1201 09:38:14.377010 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" 
event={"ID":"16706493-814b-4cc5-821d-f484a2059376","Type":"ContainerDied","Data":"c5a7cebdabb6c4c6fab4657d54f8f5c667aa6223750edba127017d2341db364c"} Dec 01 09:38:15 crc kubenswrapper[4763]: I1201 09:38:15.828927 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.002897 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2jhx\" (UniqueName: \"kubernetes.io/projected/16706493-814b-4cc5-821d-f484a2059376-kube-api-access-q2jhx\") pod \"16706493-814b-4cc5-821d-f484a2059376\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.002951 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-inventory\") pod \"16706493-814b-4cc5-821d-f484a2059376\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.003097 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-repo-setup-combined-ca-bundle\") pod \"16706493-814b-4cc5-821d-f484a2059376\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.003689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-ssh-key\") pod \"16706493-814b-4cc5-821d-f484a2059376\" (UID: \"16706493-814b-4cc5-821d-f484a2059376\") " Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.007682 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "16706493-814b-4cc5-821d-f484a2059376" (UID: "16706493-814b-4cc5-821d-f484a2059376"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.016175 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16706493-814b-4cc5-821d-f484a2059376-kube-api-access-q2jhx" (OuterVolumeSpecName: "kube-api-access-q2jhx") pod "16706493-814b-4cc5-821d-f484a2059376" (UID: "16706493-814b-4cc5-821d-f484a2059376"). InnerVolumeSpecName "kube-api-access-q2jhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.033292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-inventory" (OuterVolumeSpecName: "inventory") pod "16706493-814b-4cc5-821d-f484a2059376" (UID: "16706493-814b-4cc5-821d-f484a2059376"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.056557 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "16706493-814b-4cc5-821d-f484a2059376" (UID: "16706493-814b-4cc5-821d-f484a2059376"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.106002 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.106226 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2jhx\" (UniqueName: \"kubernetes.io/projected/16706493-814b-4cc5-821d-f484a2059376-kube-api-access-q2jhx\") on node \"crc\" DevicePath \"\"" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.106239 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.106251 4763 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16706493-814b-4cc5-821d-f484a2059376-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.396163 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" event={"ID":"16706493-814b-4cc5-821d-f484a2059376","Type":"ContainerDied","Data":"9235e85461e1068e53b6d1c5f5c42272e1afffee50203760e7f5f477e90dc3bc"} Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.396199 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9235e85461e1068e53b6d1c5f5c42272e1afffee50203760e7f5f477e90dc3bc" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.396250 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.493666 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm"] Dec 01 09:38:16 crc kubenswrapper[4763]: E1201 09:38:16.496520 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16706493-814b-4cc5-821d-f484a2059376" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.496552 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="16706493-814b-4cc5-821d-f484a2059376" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.496809 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="16706493-814b-4cc5-821d-f484a2059376" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.498193 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.501302 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.501600 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.501790 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.501914 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.508330 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm"] Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.615049 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.615126 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.615177 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.615313 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2s8\" (UniqueName: \"kubernetes.io/projected/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-kube-api-access-mr2s8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.717507 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2s8\" (UniqueName: \"kubernetes.io/projected/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-kube-api-access-mr2s8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.717582 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.717634 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.717680 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.722196 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.722404 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.725025 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.741356 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2s8\" (UniqueName: \"kubernetes.io/projected/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-kube-api-access-mr2s8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:16 crc kubenswrapper[4763]: I1201 09:38:16.873942 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:38:17 crc kubenswrapper[4763]: I1201 09:38:17.559814 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm"] Dec 01 09:38:18 crc kubenswrapper[4763]: I1201 09:38:18.415677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" event={"ID":"a2922a2d-b44c-4e37-9edc-d7e51fdea83f","Type":"ContainerStarted","Data":"4999825601fa720f6c3cf485d0c40831c8c89250b2ad404aee56a5d0a9ae2d6f"} Dec 01 09:38:18 crc kubenswrapper[4763]: I1201 09:38:18.415993 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" event={"ID":"a2922a2d-b44c-4e37-9edc-d7e51fdea83f","Type":"ContainerStarted","Data":"43b77b472f557ddc1c24dbbd25ae92b0342bed05d3347a160cc2031544152672"} Dec 01 09:38:18 crc kubenswrapper[4763]: I1201 09:38:18.462421 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" podStartSLOduration=1.882251525 podStartE2EDuration="2.462402182s" podCreationTimestamp="2025-12-01 09:38:16 +0000 UTC" firstStartedPulling="2025-12-01 09:38:17.543923684 +0000 UTC m=+1414.812572452" lastFinishedPulling="2025-12-01 09:38:18.124074331 +0000 UTC m=+1415.392723109" observedRunningTime="2025-12-01 09:38:18.45405614 +0000 UTC m=+1415.722704918" watchObservedRunningTime="2025-12-01 09:38:18.462402182 +0000 UTC m=+1415.731050950" Dec 01 09:38:33 crc kubenswrapper[4763]: I1201 09:38:33.929796 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:38:33 crc kubenswrapper[4763]: I1201 09:38:33.930262 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:39:01 crc kubenswrapper[4763]: I1201 09:39:01.966077 4763 scope.go:117] "RemoveContainer" containerID="fe5fd0e435360c1c54f9175a2bcb7e7217baa511ea5b558d42a75326d1a62c2e" Dec 01 09:39:03 crc kubenswrapper[4763]: I1201 09:39:03.929531 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:39:03 crc kubenswrapper[4763]: I1201 09:39:03.930072 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:39:03 crc kubenswrapper[4763]: I1201 09:39:03.930122 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:39:03 crc kubenswrapper[4763]: I1201 09:39:03.930899 4763 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57d5657ea17b09564dcba7a4e51f73f6b9a810185f0715911e5b25596bc9c73c"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:39:03 crc kubenswrapper[4763]: I1201 09:39:03.930957 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://57d5657ea17b09564dcba7a4e51f73f6b9a810185f0715911e5b25596bc9c73c" gracePeriod=600 Dec 01 09:39:04 crc kubenswrapper[4763]: I1201 09:39:04.832712 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="57d5657ea17b09564dcba7a4e51f73f6b9a810185f0715911e5b25596bc9c73c" exitCode=0 Dec 01 09:39:04 crc kubenswrapper[4763]: I1201 09:39:04.832795 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"57d5657ea17b09564dcba7a4e51f73f6b9a810185f0715911e5b25596bc9c73c"} Dec 01 09:39:04 crc kubenswrapper[4763]: I1201 09:39:04.833223 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"} Dec 01 09:39:04 crc kubenswrapper[4763]: I1201 09:39:04.833247 4763 scope.go:117] "RemoveContainer" containerID="cdb76d67e51814424a96785e6ed38c02e1e5ea6f161d5d45ba5cfcfc9064da51" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.051932 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f6rd9"] Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.054895 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.070030 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6rd9"] Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.162666 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-utilities\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.162829 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-catalog-content\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.162926 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r9gc\" (UniqueName: \"kubernetes.io/projected/c9ad798c-434f-44d8-b03c-82aa90db29de-kube-api-access-6r9gc\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.264329 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-catalog-content\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.264388 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r9gc\" (UniqueName: \"kubernetes.io/projected/c9ad798c-434f-44d8-b03c-82aa90db29de-kube-api-access-6r9gc\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.264553 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-utilities\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.264866 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-catalog-content\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.264928 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-utilities\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.288830 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6r9gc\" (UniqueName: \"kubernetes.io/projected/c9ad798c-434f-44d8-b03c-82aa90db29de-kube-api-access-6r9gc\") pod \"redhat-marketplace-f6rd9\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.382971 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:39:59 crc kubenswrapper[4763]: I1201 09:39:59.846377 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6rd9"] Dec 01 09:40:00 crc kubenswrapper[4763]: I1201 09:40:00.323089 4763 generic.go:334] "Generic (PLEG): container finished" podID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerID="829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e" exitCode=0 Dec 01 09:40:00 crc kubenswrapper[4763]: I1201 09:40:00.323301 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6rd9" event={"ID":"c9ad798c-434f-44d8-b03c-82aa90db29de","Type":"ContainerDied","Data":"829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e"} Dec 01 09:40:00 crc kubenswrapper[4763]: I1201 09:40:00.324447 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6rd9" event={"ID":"c9ad798c-434f-44d8-b03c-82aa90db29de","Type":"ContainerStarted","Data":"e679f6817264cf6380482bd62f68cc6c070c369e57d263089f646c24acd45928"} Dec 01 09:40:01 crc kubenswrapper[4763]: I1201 09:40:01.335066 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6rd9" event={"ID":"c9ad798c-434f-44d8-b03c-82aa90db29de","Type":"ContainerStarted","Data":"581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85"} Dec 01 09:40:02 crc kubenswrapper[4763]: I1201 09:40:02.021894 4763 scope.go:117] "RemoveContainer" containerID="58f996d498c3eb25ac59916ca1a98387aa1f9cd1a7b24403dd0736eaffd3aa37" Dec 01 09:40:02 crc kubenswrapper[4763]: I1201 09:40:02.344338 4763 generic.go:334] "Generic (PLEG): container finished" podID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerID="581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85" exitCode=0 Dec 01 09:40:02 crc kubenswrapper[4763]: I1201 09:40:02.344504 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6rd9" event={"ID":"c9ad798c-434f-44d8-b03c-82aa90db29de","Type":"ContainerDied","Data":"581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85"} Dec 01 09:40:03 crc kubenswrapper[4763]: I1201 09:40:03.354175 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6rd9" event={"ID":"c9ad798c-434f-44d8-b03c-82aa90db29de","Type":"ContainerStarted","Data":"6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71"} Dec 01 09:40:03 crc kubenswrapper[4763]: I1201 09:40:03.392023 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f6rd9" podStartSLOduration=1.924834056 podStartE2EDuration="4.391991144s" podCreationTimestamp="2025-12-01 09:39:59 +0000 UTC" firstStartedPulling="2025-12-01 09:40:00.32534768 +0000 UTC m=+1517.593996448" lastFinishedPulling="2025-12-01 09:40:02.792504768 +0000 UTC m=+1520.061153536" observedRunningTime="2025-12-01 09:40:03.377102758 +0000 UTC m=+1520.645751546" watchObservedRunningTime="2025-12-01 09:40:03.391991144 
+0000 UTC m=+1520.660639932" Dec 01 09:40:09 crc kubenswrapper[4763]: I1201 09:40:09.383704 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:40:09 crc kubenswrapper[4763]: I1201 09:40:09.384105 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:40:09 crc kubenswrapper[4763]: I1201 09:40:09.440930 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:40:09 crc kubenswrapper[4763]: I1201 09:40:09.487065 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:40:09 crc kubenswrapper[4763]: I1201 09:40:09.678223 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6rd9"] Dec 01 09:40:11 crc kubenswrapper[4763]: I1201 09:40:11.430836 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f6rd9" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerName="registry-server" containerID="cri-o://6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71" gracePeriod=2 Dec 01 09:40:11 crc kubenswrapper[4763]: I1201 09:40:11.969933 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.022846 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-catalog-content\") pod \"c9ad798c-434f-44d8-b03c-82aa90db29de\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.022995 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-utilities\") pod \"c9ad798c-434f-44d8-b03c-82aa90db29de\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.023045 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r9gc\" (UniqueName: \"kubernetes.io/projected/c9ad798c-434f-44d8-b03c-82aa90db29de-kube-api-access-6r9gc\") pod \"c9ad798c-434f-44d8-b03c-82aa90db29de\" (UID: \"c9ad798c-434f-44d8-b03c-82aa90db29de\") " Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.023853 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-utilities" (OuterVolumeSpecName: "utilities") pod "c9ad798c-434f-44d8-b03c-82aa90db29de" (UID: "c9ad798c-434f-44d8-b03c-82aa90db29de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.030726 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ad798c-434f-44d8-b03c-82aa90db29de-kube-api-access-6r9gc" (OuterVolumeSpecName: "kube-api-access-6r9gc") pod "c9ad798c-434f-44d8-b03c-82aa90db29de" (UID: "c9ad798c-434f-44d8-b03c-82aa90db29de"). InnerVolumeSpecName "kube-api-access-6r9gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.044731 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9ad798c-434f-44d8-b03c-82aa90db29de" (UID: "c9ad798c-434f-44d8-b03c-82aa90db29de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.125049 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.125391 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ad798c-434f-44d8-b03c-82aa90db29de-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.125405 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r9gc\" (UniqueName: \"kubernetes.io/projected/c9ad798c-434f-44d8-b03c-82aa90db29de-kube-api-access-6r9gc\") on node \"crc\" DevicePath \"\"" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.443250 4763 generic.go:334] "Generic (PLEG): container finished" podID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerID="6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71" exitCode=0 Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.443301 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6rd9" event={"ID":"c9ad798c-434f-44d8-b03c-82aa90db29de","Type":"ContainerDied","Data":"6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71"} Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.443338 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6rd9" event={"ID":"c9ad798c-434f-44d8-b03c-82aa90db29de","Type":"ContainerDied","Data":"e679f6817264cf6380482bd62f68cc6c070c369e57d263089f646c24acd45928"} Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.443361 4763 scope.go:117] "RemoveContainer" containerID="6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.443361 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6rd9" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.492999 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6rd9"] Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.505857 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6rd9"] Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.506665 4763 scope.go:117] "RemoveContainer" containerID="581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.534565 4763 scope.go:117] "RemoveContainer" containerID="829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.571177 4763 scope.go:117] "RemoveContainer" containerID="6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71" Dec 01 09:40:12 crc kubenswrapper[4763]: E1201 09:40:12.571657 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71\": container with ID starting with 6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71 not found: ID does not exist" containerID="6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.571706 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71"} err="failed to get container status \"6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71\": rpc error: code = NotFound desc = could not find container \"6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71\": container with ID starting with 6ed1e77f20b0eeb6c50b9b0238460b9a3f095a8006b3738f2736b6e69c545e71 not found: ID does not exist" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.571740 4763 scope.go:117] "RemoveContainer" containerID="581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85" Dec 01 09:40:12 crc kubenswrapper[4763]: E1201 09:40:12.572171 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85\": container with ID starting with 581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85 not found: ID does not exist" containerID="581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.572202 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85"} err="failed to get container status \"581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85\": rpc error: code = NotFound desc = could not find container \"581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85\": container with ID starting with 581f70bbc4bd00a9326f69debb61c172d550a7dd48875bb41467d0c59e7bbb85 not found: ID does not exist" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.572224 4763 scope.go:117] "RemoveContainer" containerID="829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e" Dec 01 09:40:12 crc kubenswrapper[4763]: E1201 09:40:12.572628 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e\": container with ID starting with 829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e not found: ID does not exist" containerID="829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e" Dec 01 09:40:12 crc kubenswrapper[4763]: I1201 09:40:12.572659 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e"} err="failed to get container status \"829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e\": rpc error: code = NotFound desc = could not find container \"829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e\": container with ID starting with 829b8726ce30157ecbee78b39c3564900149e5dced4311fed704f26e3509888e not found: ID does not exist" Dec 01 09:40:13 crc kubenswrapper[4763]: I1201 09:40:13.009288 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" path="/var/lib/kubelet/pods/c9ad798c-434f-44d8-b03c-82aa90db29de/volumes" Dec 01 09:41:02 crc kubenswrapper[4763]: I1201 09:41:02.309095 4763 scope.go:117] "RemoveContainer" containerID="5c5f23cbedf8cb01e336ca0f3c8eff3b685c9a20ea65db5ce171bacad5aebaeb" Dec 01 09:41:02 crc kubenswrapper[4763]: I1201 09:41:02.332124 4763 scope.go:117] "RemoveContainer" containerID="33d8d8da1b425d65709f154dfa6e7b54a08e7261d1d3b69e069602105c8d1853" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.200806 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-psvsw"] Dec 01 09:41:15 crc kubenswrapper[4763]: E1201 09:41:15.201684 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerName="registry-server" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.201700 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerName="registry-server" Dec 01 09:41:15 crc kubenswrapper[4763]: E1201 09:41:15.201708 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerName="extract-utilities" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.201714 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerName="extract-utilities" Dec 01 09:41:15 crc kubenswrapper[4763]: E1201 09:41:15.201725 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerName="extract-content" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.201732 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerName="extract-content" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.201909 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad798c-434f-44d8-b03c-82aa90db29de" containerName="registry-server" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.203336 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.219757 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-psvsw"] Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.247969 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-catalog-content\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.248040 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-utilities\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.248093 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6h8s\" (UniqueName: \"kubernetes.io/projected/24b950c6-bb69-4bee-9e2f-227915e6fe17-kube-api-access-j6h8s\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.350188 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-utilities\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.350335 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6h8s\" (UniqueName: \"kubernetes.io/projected/24b950c6-bb69-4bee-9e2f-227915e6fe17-kube-api-access-j6h8s\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.350448 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-catalog-content\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.350846 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-utilities\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.351022 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-catalog-content\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.373526 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j6h8s\" (UniqueName: \"kubernetes.io/projected/24b950c6-bb69-4bee-9e2f-227915e6fe17-kube-api-access-j6h8s\") pod \"community-operators-psvsw\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:15 crc kubenswrapper[4763]: I1201 09:41:15.524800 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:16 crc kubenswrapper[4763]: I1201 09:41:16.002163 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-psvsw"] Dec 01 09:41:16 crc kubenswrapper[4763]: I1201 09:41:16.022600 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-psvsw" event={"ID":"24b950c6-bb69-4bee-9e2f-227915e6fe17","Type":"ContainerStarted","Data":"8e33cb35070631d3812d1eed13a7572b676378819b8b384abf054617423007d6"} Dec 01 09:41:17 crc kubenswrapper[4763]: I1201 09:41:17.036102 4763 generic.go:334] "Generic (PLEG): container finished" podID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerID="5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2" exitCode=0 Dec 01 09:41:17 crc kubenswrapper[4763]: I1201 09:41:17.036197 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-psvsw" event={"ID":"24b950c6-bb69-4bee-9e2f-227915e6fe17","Type":"ContainerDied","Data":"5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2"} Dec 01 09:41:17 crc kubenswrapper[4763]: I1201 09:41:17.038656 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:41:18 crc kubenswrapper[4763]: I1201 09:41:18.051060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-psvsw" event={"ID":"24b950c6-bb69-4bee-9e2f-227915e6fe17","Type":"ContainerStarted","Data":"ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35"} Dec 01 09:41:19 crc kubenswrapper[4763]: I1201 09:41:19.065392 4763 generic.go:334] "Generic (PLEG): container finished" podID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerID="ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35" exitCode=0 Dec 01 09:41:19 crc kubenswrapper[4763]: I1201 09:41:19.065523 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-psvsw" event={"ID":"24b950c6-bb69-4bee-9e2f-227915e6fe17","Type":"ContainerDied","Data":"ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35"} Dec 01 09:41:20 crc kubenswrapper[4763]: I1201 09:41:20.075926 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-psvsw" event={"ID":"24b950c6-bb69-4bee-9e2f-227915e6fe17","Type":"ContainerStarted","Data":"9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c"} Dec 01 09:41:20 crc kubenswrapper[4763]: I1201 09:41:20.101397 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-psvsw" podStartSLOduration=2.664612516 podStartE2EDuration="5.101374713s" podCreationTimestamp="2025-12-01 09:41:15 +0000 UTC" firstStartedPulling="2025-12-01 09:41:17.038384939 +0000 UTC m=+1594.307033707" lastFinishedPulling="2025-12-01 09:41:19.475147126 +0000 UTC m=+1596.743795904" observedRunningTime="2025-12-01 09:41:20.09707304 +0000 UTC m=+1597.365721808" watchObservedRunningTime="2025-12-01 
09:41:20.101374713 +0000 UTC m=+1597.370023481" Dec 01 09:41:25 crc kubenswrapper[4763]: I1201 09:41:25.525522 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:25 crc kubenswrapper[4763]: I1201 09:41:25.525776 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:25 crc kubenswrapper[4763]: I1201 09:41:25.569743 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:26 crc kubenswrapper[4763]: I1201 09:41:26.167436 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:26 crc kubenswrapper[4763]: I1201 09:41:26.222230 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-psvsw"] Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.136947 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-psvsw" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerName="registry-server" containerID="cri-o://9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c" gracePeriod=2 Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.552171 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.673415 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6h8s\" (UniqueName: \"kubernetes.io/projected/24b950c6-bb69-4bee-9e2f-227915e6fe17-kube-api-access-j6h8s\") pod \"24b950c6-bb69-4bee-9e2f-227915e6fe17\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.673512 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-catalog-content\") pod \"24b950c6-bb69-4bee-9e2f-227915e6fe17\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.673548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-utilities\") pod \"24b950c6-bb69-4bee-9e2f-227915e6fe17\" (UID: \"24b950c6-bb69-4bee-9e2f-227915e6fe17\") " Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.675177 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-utilities" (OuterVolumeSpecName: "utilities") pod "24b950c6-bb69-4bee-9e2f-227915e6fe17" (UID: "24b950c6-bb69-4bee-9e2f-227915e6fe17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.680070 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b950c6-bb69-4bee-9e2f-227915e6fe17-kube-api-access-j6h8s" (OuterVolumeSpecName: "kube-api-access-j6h8s") pod "24b950c6-bb69-4bee-9e2f-227915e6fe17" (UID: "24b950c6-bb69-4bee-9e2f-227915e6fe17"). InnerVolumeSpecName "kube-api-access-j6h8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.736129 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24b950c6-bb69-4bee-9e2f-227915e6fe17" (UID: "24b950c6-bb69-4bee-9e2f-227915e6fe17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.776058 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.776097 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24b950c6-bb69-4bee-9e2f-227915e6fe17-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:28 crc kubenswrapper[4763]: I1201 09:41:28.776112 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6h8s\" (UniqueName: \"kubernetes.io/projected/24b950c6-bb69-4bee-9e2f-227915e6fe17-kube-api-access-j6h8s\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.149636 4763 generic.go:334] "Generic (PLEG): container finished" podID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerID="9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c" exitCode=0 Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.149701 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-psvsw" event={"ID":"24b950c6-bb69-4bee-9e2f-227915e6fe17","Type":"ContainerDied","Data":"9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c"} Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.149728 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-psvsw" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.149754 4763 scope.go:117] "RemoveContainer" containerID="9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.149739 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-psvsw" event={"ID":"24b950c6-bb69-4bee-9e2f-227915e6fe17","Type":"ContainerDied","Data":"8e33cb35070631d3812d1eed13a7572b676378819b8b384abf054617423007d6"} Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.177275 4763 scope.go:117] "RemoveContainer" containerID="ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.180332 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-psvsw"] Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.191730 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-psvsw"] Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.201869 4763 scope.go:117] "RemoveContainer" containerID="5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.250610 4763 scope.go:117] "RemoveContainer" containerID="9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c" Dec 01 09:41:29 crc kubenswrapper[4763]: E1201 09:41:29.251417 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c\": container with ID starting with 9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c not found: ID does not exist" containerID="9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.251577 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c"} err="failed to get container status \"9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c\": rpc error: code = NotFound desc = could not find container \"9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c\": container with ID starting with 9dfc9f5b3289e44e708489474227b32da69591ff9f8427ac617cfaccd1094b2c not found: ID does not exist" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.251608 4763 scope.go:117] "RemoveContainer" containerID="ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35" Dec 01 09:41:29 crc kubenswrapper[4763]: E1201 09:41:29.251939 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35\": container with ID starting with ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35 not found: ID does not exist" containerID="ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.251972 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35"} err="failed to get container status \"ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35\": rpc error: code = NotFound desc = could not find 
container \"ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35\": container with ID starting with ea6bb20809b037efef8518174b0ed150dc05708f144d36ca55469835e75e5c35 not found: ID does not exist" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.251997 4763 scope.go:117] "RemoveContainer" containerID="5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2" Dec 01 09:41:29 crc kubenswrapper[4763]: E1201 09:41:29.252210 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2\": container with ID starting with 5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2 not found: ID does not exist" containerID="5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2" Dec 01 09:41:29 crc kubenswrapper[4763]: I1201 09:41:29.252266 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2"} err="failed to get container status \"5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2\": rpc error: code = NotFound desc = could not find container \"5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2\": container with ID starting with 5e677c47a2f93123ed04d50e21c15a40011908c0639aadb0d3f9857c239101e2 not found: ID does not exist" Dec 01 09:41:31 crc kubenswrapper[4763]: I1201 09:41:31.007182 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" path="/var/lib/kubelet/pods/24b950c6-bb69-4bee-9e2f-227915e6fe17/volumes" Dec 01 09:41:33 crc kubenswrapper[4763]: I1201 09:41:33.929539 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:41:33 crc kubenswrapper[4763]: I1201 09:41:33.929891 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:41:35 crc kubenswrapper[4763]: I1201 09:41:35.208603 4763 generic.go:334] "Generic (PLEG): container finished" podID="a2922a2d-b44c-4e37-9edc-d7e51fdea83f" containerID="4999825601fa720f6c3cf485d0c40831c8c89250b2ad404aee56a5d0a9ae2d6f" exitCode=0 Dec 01 09:41:35 crc kubenswrapper[4763]: I1201 09:41:35.208696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" event={"ID":"a2922a2d-b44c-4e37-9edc-d7e51fdea83f","Type":"ContainerDied","Data":"4999825601fa720f6c3cf485d0c40831c8c89250b2ad404aee56a5d0a9ae2d6f"} Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.653329 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.820489 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-inventory\") pod \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.820646 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-ssh-key\") pod \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.820692 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-bootstrap-combined-ca-bundle\") pod \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.820748 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr2s8\" (UniqueName: \"kubernetes.io/projected/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-kube-api-access-mr2s8\") pod \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\" (UID: \"a2922a2d-b44c-4e37-9edc-d7e51fdea83f\") " Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.827172 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-kube-api-access-mr2s8" (OuterVolumeSpecName: "kube-api-access-mr2s8") pod "a2922a2d-b44c-4e37-9edc-d7e51fdea83f" (UID: "a2922a2d-b44c-4e37-9edc-d7e51fdea83f"). InnerVolumeSpecName "kube-api-access-mr2s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.827734 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a2922a2d-b44c-4e37-9edc-d7e51fdea83f" (UID: "a2922a2d-b44c-4e37-9edc-d7e51fdea83f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.850386 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-inventory" (OuterVolumeSpecName: "inventory") pod "a2922a2d-b44c-4e37-9edc-d7e51fdea83f" (UID: "a2922a2d-b44c-4e37-9edc-d7e51fdea83f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.856328 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a2922a2d-b44c-4e37-9edc-d7e51fdea83f" (UID: "a2922a2d-b44c-4e37-9edc-d7e51fdea83f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.922233 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.922271 4763 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.922282 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr2s8\" (UniqueName: \"kubernetes.io/projected/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-kube-api-access-mr2s8\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:36 crc kubenswrapper[4763]: I1201 09:41:36.922291 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a2922a2d-b44c-4e37-9edc-d7e51fdea83f-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.144180 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hcplt"] Dec 01 09:41:37 crc kubenswrapper[4763]: E1201 09:41:37.144602 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2922a2d-b44c-4e37-9edc-d7e51fdea83f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.144624 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2922a2d-b44c-4e37-9edc-d7e51fdea83f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:41:37 crc kubenswrapper[4763]: E1201 09:41:37.144641 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerName="extract-utilities" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.144649 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerName="extract-utilities" Dec 01 09:41:37 crc kubenswrapper[4763]: E1201 09:41:37.144672 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerName="registry-server" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.144678 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerName="registry-server" Dec 01 09:41:37 crc kubenswrapper[4763]: E1201 09:41:37.144688 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerName="extract-content" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.144694 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerName="extract-content" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.144871 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2922a2d-b44c-4e37-9edc-d7e51fdea83f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.144898 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b950c6-bb69-4bee-9e2f-227915e6fe17" containerName="registry-server" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.151306 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.199505 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hcplt"] Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.226314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" event={"ID":"a2922a2d-b44c-4e37-9edc-d7e51fdea83f","Type":"ContainerDied","Data":"43b77b472f557ddc1c24dbbd25ae92b0342bed05d3347a160cc2031544152672"} Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.226593 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b77b472f557ddc1c24dbbd25ae92b0342bed05d3347a160cc2031544152672" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.226734 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.309985 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx"] Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.311376 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.313288 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.313685 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.314007 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.316866 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.329806 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbjc5\" (UniqueName: \"kubernetes.io/projected/7f36e64b-860e-4685-96c0-6bdf8a6ac135-kube-api-access-kbjc5\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.330160 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-catalog-content\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.330315 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-utilities\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.350176 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx"] Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.432480 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbjc5\" (UniqueName: \"kubernetes.io/projected/7f36e64b-860e-4685-96c0-6bdf8a6ac135-kube-api-access-kbjc5\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.432783 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.432831 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.432875 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-catalog-content\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.432940 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw8ml\" (UniqueName: \"kubernetes.io/projected/b0a0f61c-6313-4799-9c99-47866415c99a-kube-api-access-zw8ml\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.432961 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-utilities\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.433348 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-utilities\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.433435 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-catalog-content\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.462131 4763 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-kbjc5\" (UniqueName: \"kubernetes.io/projected/7f36e64b-860e-4685-96c0-6bdf8a6ac135-kube-api-access-kbjc5\") pod \"certified-operators-hcplt\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.477187 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.534871 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw8ml\" (UniqueName: \"kubernetes.io/projected/b0a0f61c-6313-4799-9c99-47866415c99a-kube-api-access-zw8ml\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.535004 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.535053 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.539549 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.540195 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.556116 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw8ml\" (UniqueName: \"kubernetes.io/projected/b0a0f61c-6313-4799-9c99-47866415c99a-kube-api-access-zw8ml\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-849cx\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.630530 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:41:37 crc kubenswrapper[4763]: I1201 09:41:37.986413 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hcplt"] Dec 01 09:41:38 crc kubenswrapper[4763]: I1201 09:41:38.230350 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx"] Dec 01 09:41:38 crc kubenswrapper[4763]: I1201 09:41:38.235821 4763 generic.go:334] "Generic (PLEG): container finished" podID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerID="a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb" exitCode=0 Dec 01 09:41:38 crc kubenswrapper[4763]: I1201 09:41:38.235860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcplt" event={"ID":"7f36e64b-860e-4685-96c0-6bdf8a6ac135","Type":"ContainerDied","Data":"a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb"} Dec 01 09:41:38 crc kubenswrapper[4763]: I1201 09:41:38.235887 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcplt" event={"ID":"7f36e64b-860e-4685-96c0-6bdf8a6ac135","Type":"ContainerStarted","Data":"e433f0582ea41441a89789d30fb7876e937f56f475a17152aed6ddeccf011d67"} Dec 01 09:41:38 crc kubenswrapper[4763]: W1201 09:41:38.246308 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a0f61c_6313_4799_9c99_47866415c99a.slice/crio-3e26f0b41386c47a0c762bdb41467bc9d25c08b5109d2739ae8b6d1b2e03af5e WatchSource:0}: Error finding container 3e26f0b41386c47a0c762bdb41467bc9d25c08b5109d2739ae8b6d1b2e03af5e: Status 404 returned error can't find the container with id 3e26f0b41386c47a0c762bdb41467bc9d25c08b5109d2739ae8b6d1b2e03af5e Dec 01 09:41:39 crc kubenswrapper[4763]: I1201 09:41:39.246804 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" event={"ID":"b0a0f61c-6313-4799-9c99-47866415c99a","Type":"ContainerStarted","Data":"9f3f5d3da66093649aa278b8d4ac87629454876c9e3a2dc521c37ed532b588c6"} Dec 01 09:41:39 crc kubenswrapper[4763]: I1201 09:41:39.249576 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" event={"ID":"b0a0f61c-6313-4799-9c99-47866415c99a","Type":"ContainerStarted","Data":"3e26f0b41386c47a0c762bdb41467bc9d25c08b5109d2739ae8b6d1b2e03af5e"} Dec 01 09:41:39 crc kubenswrapper[4763]: I1201 09:41:39.251565 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcplt" event={"ID":"7f36e64b-860e-4685-96c0-6bdf8a6ac135","Type":"ContainerStarted","Data":"393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7"} Dec 01 09:41:39 crc kubenswrapper[4763]: I1201 09:41:39.270788 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" podStartSLOduration=1.635244157 podStartE2EDuration="2.27077062s" podCreationTimestamp="2025-12-01 09:41:37 +0000 UTC" firstStartedPulling="2025-12-01 09:41:38.248674093 +0000 UTC m=+1615.517322861" lastFinishedPulling="2025-12-01 09:41:38.884200556 +0000 UTC m=+1616.152849324" observedRunningTime="2025-12-01 09:41:39.261259819 +0000 UTC m=+1616.529908587" watchObservedRunningTime="2025-12-01 09:41:39.27077062 
+0000 UTC m=+1616.539419388" Dec 01 09:41:41 crc kubenswrapper[4763]: I1201 09:41:41.270930 4763 generic.go:334] "Generic (PLEG): container finished" podID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerID="393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7" exitCode=0 Dec 01 09:41:41 crc kubenswrapper[4763]: I1201 09:41:41.270999 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcplt" event={"ID":"7f36e64b-860e-4685-96c0-6bdf8a6ac135","Type":"ContainerDied","Data":"393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7"} Dec 01 09:41:42 crc kubenswrapper[4763]: I1201 09:41:42.283913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcplt" event={"ID":"7f36e64b-860e-4685-96c0-6bdf8a6ac135","Type":"ContainerStarted","Data":"88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7"} Dec 01 09:41:42 crc kubenswrapper[4763]: I1201 09:41:42.306298 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hcplt" podStartSLOduration=1.76589241 podStartE2EDuration="5.306281059s" podCreationTimestamp="2025-12-01 09:41:37 +0000 UTC" firstStartedPulling="2025-12-01 09:41:38.238854644 +0000 UTC m=+1615.507503412" lastFinishedPulling="2025-12-01 09:41:41.779243293 +0000 UTC m=+1619.047892061" observedRunningTime="2025-12-01 09:41:42.301988385 +0000 UTC m=+1619.570637153" watchObservedRunningTime="2025-12-01 09:41:42.306281059 +0000 UTC m=+1619.574929827" Dec 01 09:41:47 crc kubenswrapper[4763]: I1201 09:41:47.477700 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:47 crc kubenswrapper[4763]: I1201 09:41:47.478262 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:47 crc kubenswrapper[4763]: I1201 09:41:47.541208 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:48 crc kubenswrapper[4763]: I1201 09:41:48.383963 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:48 crc kubenswrapper[4763]: I1201 09:41:48.443177 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hcplt"] Dec 01 09:41:50 crc kubenswrapper[4763]: I1201 09:41:50.352245 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hcplt" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerName="registry-server" containerID="cri-o://88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7" gracePeriod=2 Dec 01 09:41:50 crc kubenswrapper[4763]: I1201 09:41:50.829876 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.004658 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbjc5\" (UniqueName: \"kubernetes.io/projected/7f36e64b-860e-4685-96c0-6bdf8a6ac135-kube-api-access-kbjc5\") pod \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.004739 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-utilities\") pod \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.004837 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-catalog-content\") pod \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\" (UID: \"7f36e64b-860e-4685-96c0-6bdf8a6ac135\") " Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.005607 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-utilities" (OuterVolumeSpecName: "utilities") pod "7f36e64b-860e-4685-96c0-6bdf8a6ac135" (UID: "7f36e64b-860e-4685-96c0-6bdf8a6ac135"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.005816 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.011706 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f36e64b-860e-4685-96c0-6bdf8a6ac135-kube-api-access-kbjc5" (OuterVolumeSpecName: "kube-api-access-kbjc5") pod "7f36e64b-860e-4685-96c0-6bdf8a6ac135" (UID: "7f36e64b-860e-4685-96c0-6bdf8a6ac135"). InnerVolumeSpecName "kube-api-access-kbjc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.055801 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f36e64b-860e-4685-96c0-6bdf8a6ac135" (UID: "7f36e64b-860e-4685-96c0-6bdf8a6ac135"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.107356 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbjc5\" (UniqueName: \"kubernetes.io/projected/7f36e64b-860e-4685-96c0-6bdf8a6ac135-kube-api-access-kbjc5\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.107388 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f36e64b-860e-4685-96c0-6bdf8a6ac135-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.366169 4763 generic.go:334] "Generic (PLEG): container finished" podID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerID="88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7" exitCode=0 Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.366274 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hcplt" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.366242 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcplt" event={"ID":"7f36e64b-860e-4685-96c0-6bdf8a6ac135","Type":"ContainerDied","Data":"88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7"} Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.366548 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hcplt" event={"ID":"7f36e64b-860e-4685-96c0-6bdf8a6ac135","Type":"ContainerDied","Data":"e433f0582ea41441a89789d30fb7876e937f56f475a17152aed6ddeccf011d67"} Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.366607 4763 scope.go:117] "RemoveContainer" containerID="88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.407701 4763 scope.go:117] "RemoveContainer" containerID="393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.453437 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hcplt"] Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.457976 4763 scope.go:117] "RemoveContainer" containerID="a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.465986 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hcplt"] Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.500188 4763 scope.go:117] "RemoveContainer" containerID="88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7" Dec 01 09:41:51 crc kubenswrapper[4763]: E1201 09:41:51.500789 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7\": container with ID starting with 88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7 not found: ID does not exist" containerID="88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.500916 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7"} err="failed to get container status 
\"88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7\": rpc error: code = NotFound desc = could not find container \"88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7\": container with ID starting with 88ca8bcad9673df29e7e85569025a48a5a4d959c6979306c57d96ba3a9e8e5a7 not found: ID does not exist" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.501187 4763 scope.go:117] "RemoveContainer" containerID="393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7" Dec 01 09:41:51 crc kubenswrapper[4763]: E1201 09:41:51.501947 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7\": container with ID starting with 393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7 not found: ID does not exist" containerID="393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.502093 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7"} err="failed to get container status \"393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7\": rpc error: code = NotFound desc = could not find container \"393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7\": container with ID starting with 393678125b8d3023163295320be19a26f21c50b52819d5da536a66dadfd00ab7 not found: ID does not exist" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.502180 4763 scope.go:117] "RemoveContainer" containerID="a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb" Dec 01 09:41:51 crc kubenswrapper[4763]: E1201 09:41:51.502746 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb\": container with ID starting with a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb not found: ID does not exist" containerID="a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb" Dec 01 09:41:51 crc kubenswrapper[4763]: I1201 09:41:51.502781 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb"} err="failed to get container status \"a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb\": rpc error: code = NotFound desc = could not find container \"a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb\": container with ID starting with a863a0e667d5487866cc8620a6bbc66d99a2383dbf586e1a6179f7fe90784ffb not found: ID does not exist" Dec 01 09:41:53 crc kubenswrapper[4763]: I1201 09:41:53.028425 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" path="/var/lib/kubelet/pods/7f36e64b-860e-4685-96c0-6bdf8a6ac135/volumes" Dec 01 09:42:03 crc kubenswrapper[4763]: I1201 09:42:03.930171 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:42:03 crc kubenswrapper[4763]: I1201 09:42:03.930692 4763 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:42:16 crc kubenswrapper[4763]: I1201 09:42:16.053628 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3641-account-create-update-fzfbs"] Dec 01 09:42:16 crc kubenswrapper[4763]: I1201 09:42:16.061394 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lfskc"] Dec 01 09:42:16 crc kubenswrapper[4763]: I1201 09:42:16.070997 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3641-account-create-update-fzfbs"] Dec 01 09:42:16 crc kubenswrapper[4763]: I1201 09:42:16.078484 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lfskc"] Dec 01 09:42:17 crc kubenswrapper[4763]: I1201 09:42:17.006658 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a341bd-d0a9-4c8f-a8e6-3941faa51e05" path="/var/lib/kubelet/pods/30a341bd-d0a9-4c8f-a8e6-3941faa51e05/volumes" Dec 01 09:42:17 crc kubenswrapper[4763]: I1201 09:42:17.007215 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9aeec68-c0a0-4c88-994c-b229ef8aa223" path="/var/lib/kubelet/pods/c9aeec68-c0a0-4c88-994c-b229ef8aa223/volumes" Dec 01 09:42:23 crc kubenswrapper[4763]: I1201 09:42:23.043413 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fd4n8"] Dec 01 09:42:23 crc kubenswrapper[4763]: I1201 09:42:23.054420 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xqvdp"] Dec 01 09:42:23 crc kubenswrapper[4763]: I1201 09:42:23.063812 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fd4n8"] Dec 01 09:42:23 crc kubenswrapper[4763]: I1201 09:42:23.071542 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xqvdp"] Dec 01 09:42:24 crc kubenswrapper[4763]: I1201 09:42:24.025963 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-35ab-account-create-update-7jbxv"] Dec 01 09:42:24 crc kubenswrapper[4763]: I1201 09:42:24.034721 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f80b-account-create-update-bmk58"] Dec 01 09:42:24 crc kubenswrapper[4763]: I1201 09:42:24.044225 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f80b-account-create-update-bmk58"] Dec 01 09:42:24 crc kubenswrapper[4763]: I1201 09:42:24.055612 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-35ab-account-create-update-7jbxv"] Dec 01 09:42:25 crc kubenswrapper[4763]: I1201 09:42:25.005213 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69b412ca-2a25-4362-a6f1-67fbc091e68b" path="/var/lib/kubelet/pods/69b412ca-2a25-4362-a6f1-67fbc091e68b/volumes" Dec 01 09:42:25 crc kubenswrapper[4763]: I1201 09:42:25.006394 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afea76a-bb71-4571-bb88-c221d4a5448d" path="/var/lib/kubelet/pods/7afea76a-bb71-4571-bb88-c221d4a5448d/volumes" Dec 01 09:42:25 crc kubenswrapper[4763]: I1201 09:42:25.007199 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca6055d4-f54b-4ed5-b566-cb9be368fb20" path="/var/lib/kubelet/pods/ca6055d4-f54b-4ed5-b566-cb9be368fb20/volumes" Dec 01 
09:42:25 crc kubenswrapper[4763]: I1201 09:42:25.007736 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb77b336-6911-4e22-888a-0a8e20435893" path="/var/lib/kubelet/pods/eb77b336-6911-4e22-888a-0a8e20435893/volumes" Dec 01 09:42:33 crc kubenswrapper[4763]: I1201 09:42:33.928889 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:42:33 crc kubenswrapper[4763]: I1201 09:42:33.929679 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:42:33 crc kubenswrapper[4763]: I1201 09:42:33.929757 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:42:33 crc kubenswrapper[4763]: I1201 09:42:33.931846 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:42:33 crc kubenswrapper[4763]: I1201 09:42:33.931912 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" gracePeriod=600 Dec 01 09:42:34 crc kubenswrapper[4763]: E1201 09:42:34.064559 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:42:34 crc kubenswrapper[4763]: I1201 09:42:34.733153 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" exitCode=0 Dec 01 09:42:34 crc kubenswrapper[4763]: I1201 09:42:34.733205 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"} Dec 01 09:42:34 crc kubenswrapper[4763]: I1201 09:42:34.733539 4763 scope.go:117] "RemoveContainer" containerID="57d5657ea17b09564dcba7a4e51f73f6b9a810185f0715911e5b25596bc9c73c" Dec 01 09:42:34 crc kubenswrapper[4763]: I1201 09:42:34.734278 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:42:34 crc kubenswrapper[4763]: E1201 09:42:34.734646 4763 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:42:47 crc kubenswrapper[4763]: I1201 09:42:47.995056 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:42:47 crc kubenswrapper[4763]: E1201 09:42:47.995915 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:42:56 crc kubenswrapper[4763]: I1201 09:42:56.960980 4763 generic.go:334] "Generic (PLEG): container finished" podID="b0a0f61c-6313-4799-9c99-47866415c99a" containerID="9f3f5d3da66093649aa278b8d4ac87629454876c9e3a2dc521c37ed532b588c6" exitCode=0 Dec 01 09:42:56 crc kubenswrapper[4763]: I1201 09:42:56.961160 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" event={"ID":"b0a0f61c-6313-4799-9c99-47866415c99a","Type":"ContainerDied","Data":"9f3f5d3da66093649aa278b8d4ac87629454876c9e3a2dc521c37ed532b588c6"} Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.440684 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.520528 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-inventory\") pod \"b0a0f61c-6313-4799-9c99-47866415c99a\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.520660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-ssh-key\") pod \"b0a0f61c-6313-4799-9c99-47866415c99a\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.520724 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw8ml\" (UniqueName: \"kubernetes.io/projected/b0a0f61c-6313-4799-9c99-47866415c99a-kube-api-access-zw8ml\") pod \"b0a0f61c-6313-4799-9c99-47866415c99a\" (UID: \"b0a0f61c-6313-4799-9c99-47866415c99a\") " Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.535782 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a0f61c-6313-4799-9c99-47866415c99a-kube-api-access-zw8ml" (OuterVolumeSpecName: "kube-api-access-zw8ml") pod "b0a0f61c-6313-4799-9c99-47866415c99a" (UID: "b0a0f61c-6313-4799-9c99-47866415c99a"). InnerVolumeSpecName "kube-api-access-zw8ml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.548350 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-inventory" (OuterVolumeSpecName: "inventory") pod "b0a0f61c-6313-4799-9c99-47866415c99a" (UID: "b0a0f61c-6313-4799-9c99-47866415c99a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.560411 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0a0f61c-6313-4799-9c99-47866415c99a" (UID: "b0a0f61c-6313-4799-9c99-47866415c99a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.624220 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw8ml\" (UniqueName: \"kubernetes.io/projected/b0a0f61c-6313-4799-9c99-47866415c99a-kube-api-access-zw8ml\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.624299 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.624314 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0a0f61c-6313-4799-9c99-47866415c99a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.989742 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" event={"ID":"b0a0f61c-6313-4799-9c99-47866415c99a","Type":"ContainerDied","Data":"3e26f0b41386c47a0c762bdb41467bc9d25c08b5109d2739ae8b6d1b2e03af5e"} Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.989782 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e26f0b41386c47a0c762bdb41467bc9d25c08b5109d2739ae8b6d1b2e03af5e" Dec 01 09:42:58 crc kubenswrapper[4763]: I1201 09:42:58.990316 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.081858 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt"] Dec 01 09:42:59 crc kubenswrapper[4763]: E1201 09:42:59.082206 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerName="registry-server" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.082221 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerName="registry-server" Dec 01 09:42:59 crc kubenswrapper[4763]: E1201 09:42:59.082240 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerName="extract-utilities" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.082246 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerName="extract-utilities" Dec 01 09:42:59 crc kubenswrapper[4763]: E1201 09:42:59.082258 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a0f61c-6313-4799-9c99-47866415c99a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.082265 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a0f61c-6313-4799-9c99-47866415c99a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:59 crc kubenswrapper[4763]: E1201 09:42:59.082280 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerName="extract-content" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.082285 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerName="extract-content" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.082470 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f36e64b-860e-4685-96c0-6bdf8a6ac135" containerName="registry-server" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.082501 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a0f61c-6313-4799-9c99-47866415c99a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.083084 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.085890 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.086071 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.086194 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.090246 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.099934 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt"] Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.131726 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.131894 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56k47\" (UniqueName: \"kubernetes.io/projected/852eca7e-7a83-49d7-9950-aabc202ec4ec-kube-api-access-56k47\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.132179 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.234097 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.234173 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.234245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56k47\" (UniqueName: \"kubernetes.io/projected/852eca7e-7a83-49d7-9950-aabc202ec4ec-kube-api-access-56k47\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.237487 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.238061 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.252008 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56k47\" (UniqueName: \"kubernetes.io/projected/852eca7e-7a83-49d7-9950-aabc202ec4ec-kube-api-access-56k47\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.405156 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:42:59 crc kubenswrapper[4763]: I1201 09:42:59.947373 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt"] Dec 01 09:43:00 crc kubenswrapper[4763]: I1201 09:43:00.001983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" event={"ID":"852eca7e-7a83-49d7-9950-aabc202ec4ec","Type":"ContainerStarted","Data":"b98f30976fe199d95a4a868de06ae60f5c5d138cfea65d15563be4b11d4bf398"} Dec 01 09:43:01 crc kubenswrapper[4763]: I1201 09:43:01.022807 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" event={"ID":"852eca7e-7a83-49d7-9950-aabc202ec4ec","Type":"ContainerStarted","Data":"ee920331a3d92b032e0fe611dc65f142fec59e6cdb2dbd316e404b35845892ac"} Dec 01 09:43:01 crc kubenswrapper[4763]: I1201 09:43:01.053925 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" podStartSLOduration=1.541114629 podStartE2EDuration="2.053905266s" podCreationTimestamp="2025-12-01 09:42:59 +0000 UTC" firstStartedPulling="2025-12-01 09:42:59.96104405 +0000 UTC m=+1697.229692818" lastFinishedPulling="2025-12-01 09:43:00.473834687 +0000 UTC m=+1697.742483455" observedRunningTime="2025-12-01 09:43:01.051255623 +0000 UTC m=+1698.319904401" watchObservedRunningTime="2025-12-01 09:43:01.053905266 +0000 UTC m=+1698.322554034" Dec 01 09:43:01 crc kubenswrapper[4763]: I1201 09:43:01.994303 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:43:01 crc kubenswrapper[4763]: E1201 09:43:01.994849 4763 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.051169 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-485b-account-create-update-6l8rt"] Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.063008 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-abc0-account-create-update-74tjb"] Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.072344 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-485b-account-create-update-6l8rt"] Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.079610 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-abc0-account-create-update-74tjb"] Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.469871 4763 scope.go:117] "RemoveContainer" containerID="43db7f4282f3c4f7cff725f26f345432c8c6593fbfb96944b1b4e284cfea8d7f" Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.497862 4763 scope.go:117] "RemoveContainer" containerID="8d079fe5b541b44d6fb1f052c2687cb9a7becb442ce8bbc108e5efb3983f1ab7" Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.522158 4763 scope.go:117] "RemoveContainer" containerID="fede3a5effb497c1ab7d6e74f5b866e2c8f3dc2a2652c587948508fa342c9ff5" Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.570008 4763 scope.go:117] "RemoveContainer" containerID="26bc2ff8669dc59d6ea2499d5bc9ba97f352c002d3b43660fa6ef1c3ee31da86" Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.615733 4763 scope.go:117] "RemoveContainer" containerID="25c0d181959c35a62c6b6ea3f219633715f9c19cf2ae403d0a07c6050b85bed5" Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.668784 4763 scope.go:117] "RemoveContainer" containerID="dd9c61740051e58d77cdcfab86e5eaa3b2d704cafb181d8664e8da2be0498b42" Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.712049 4763 scope.go:117] "RemoveContainer" containerID="f64a184dac5e764ff6478bd366a1de4d336b7ebb88b379420fd2801046f5a0db" Dec 01 09:43:02 crc kubenswrapper[4763]: I1201 09:43:02.732936 4763 scope.go:117] "RemoveContainer" containerID="a562aea9491cc2cba051117385028e9a429d54c7b13fa276d7b918da2580ad4d" Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.019064 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef3f761-a29f-493f-8ace-68422774cc27" path="/var/lib/kubelet/pods/6ef3f761-a29f-493f-8ace-68422774cc27/volumes" Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.020006 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8d0192-f2bf-4dfb-b7b7-fba83523017e" path="/var/lib/kubelet/pods/dc8d0192-f2bf-4dfb-b7b7-fba83523017e/volumes" Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.046700 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-wk8xf"] Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.055989 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-42dt5"] Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.067677 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-42dt5"] Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.078320 
4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-drwnp"] Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.087682 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4e18-account-create-update-b6f8x"] Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.096137 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-wk8xf"] Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.104672 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4e18-account-create-update-b6f8x"] Dec 01 09:43:03 crc kubenswrapper[4763]: I1201 09:43:03.112668 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-drwnp"] Dec 01 09:43:05 crc kubenswrapper[4763]: I1201 09:43:05.005982 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05241c5a-6373-4ec7-a9fc-737e27036bb0" path="/var/lib/kubelet/pods/05241c5a-6373-4ec7-a9fc-737e27036bb0/volumes" Dec 01 09:43:05 crc kubenswrapper[4763]: I1201 09:43:05.006912 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85027a6d-3549-4e56-80b6-b8133b118eae" path="/var/lib/kubelet/pods/85027a6d-3549-4e56-80b6-b8133b118eae/volumes" Dec 01 09:43:05 crc kubenswrapper[4763]: I1201 09:43:05.007393 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b805f491-7378-4eae-8c43-af5315818571" path="/var/lib/kubelet/pods/b805f491-7378-4eae-8c43-af5315818571/volumes" Dec 01 09:43:05 crc kubenswrapper[4763]: I1201 09:43:05.008039 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a0b7a2-5ddf-4b28-8d66-73c0b126b740" path="/var/lib/kubelet/pods/d0a0b7a2-5ddf-4b28-8d66-73c0b126b740/volumes" Dec 01 09:43:06 crc kubenswrapper[4763]: I1201 09:43:06.069868 4763 generic.go:334] "Generic (PLEG): container finished" podID="852eca7e-7a83-49d7-9950-aabc202ec4ec" containerID="ee920331a3d92b032e0fe611dc65f142fec59e6cdb2dbd316e404b35845892ac" exitCode=0 Dec 01 09:43:06 crc kubenswrapper[4763]: I1201 09:43:06.069913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" event={"ID":"852eca7e-7a83-49d7-9950-aabc202ec4ec","Type":"ContainerDied","Data":"ee920331a3d92b032e0fe611dc65f142fec59e6cdb2dbd316e404b35845892ac"} Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.480589 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.608629 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-ssh-key\") pod \"852eca7e-7a83-49d7-9950-aabc202ec4ec\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.608955 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56k47\" (UniqueName: \"kubernetes.io/projected/852eca7e-7a83-49d7-9950-aabc202ec4ec-kube-api-access-56k47\") pod \"852eca7e-7a83-49d7-9950-aabc202ec4ec\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.609029 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-inventory\") pod \"852eca7e-7a83-49d7-9950-aabc202ec4ec\" (UID: \"852eca7e-7a83-49d7-9950-aabc202ec4ec\") " Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.616686 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/852eca7e-7a83-49d7-9950-aabc202ec4ec-kube-api-access-56k47" (OuterVolumeSpecName: "kube-api-access-56k47") pod "852eca7e-7a83-49d7-9950-aabc202ec4ec" (UID: "852eca7e-7a83-49d7-9950-aabc202ec4ec"). InnerVolumeSpecName "kube-api-access-56k47". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.654119 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "852eca7e-7a83-49d7-9950-aabc202ec4ec" (UID: "852eca7e-7a83-49d7-9950-aabc202ec4ec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.667026 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-inventory" (OuterVolumeSpecName: "inventory") pod "852eca7e-7a83-49d7-9950-aabc202ec4ec" (UID: "852eca7e-7a83-49d7-9950-aabc202ec4ec"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.711537 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.711574 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56k47\" (UniqueName: \"kubernetes.io/projected/852eca7e-7a83-49d7-9950-aabc202ec4ec-kube-api-access-56k47\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:07 crc kubenswrapper[4763]: I1201 09:43:07.711591 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/852eca7e-7a83-49d7-9950-aabc202ec4ec-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.091537 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" event={"ID":"852eca7e-7a83-49d7-9950-aabc202ec4ec","Type":"ContainerDied","Data":"b98f30976fe199d95a4a868de06ae60f5c5d138cfea65d15563be4b11d4bf398"} Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.091597 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b98f30976fe199d95a4a868de06ae60f5c5d138cfea65d15563be4b11d4bf398" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.091633 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.197138 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx"] Dec 01 09:43:08 crc kubenswrapper[4763]: E1201 09:43:08.197571 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="852eca7e-7a83-49d7-9950-aabc202ec4ec" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.197590 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="852eca7e-7a83-49d7-9950-aabc202ec4ec" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.197762 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="852eca7e-7a83-49d7-9950-aabc202ec4ec" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.198397 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.200706 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.202163 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.202311 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.202426 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.258830 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx"] Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.322907 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.323369 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.323668 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shd8p\" (UniqueName: \"kubernetes.io/projected/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-kube-api-access-shd8p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.425260 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.425306 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shd8p\" (UniqueName: \"kubernetes.io/projected/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-kube-api-access-shd8p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.425435 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: 
\"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.429285 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.429762 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.448829 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shd8p\" (UniqueName: \"kubernetes.io/projected/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-kube-api-access-shd8p\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-m5tgx\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:08 crc kubenswrapper[4763]: I1201 09:43:08.520317 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:09 crc kubenswrapper[4763]: I1201 09:43:09.065695 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx"] Dec 01 09:43:09 crc kubenswrapper[4763]: I1201 09:43:09.108719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" event={"ID":"25ea7914-fefd-4dd4-a1aa-f0593874e6c2","Type":"ContainerStarted","Data":"100a150b9c4a5b4c7ba5b40c558b35f2a7cb5067362d881c25a366f3329ffb50"} Dec 01 09:43:10 crc kubenswrapper[4763]: I1201 09:43:10.118110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" event={"ID":"25ea7914-fefd-4dd4-a1aa-f0593874e6c2","Type":"ContainerStarted","Data":"4df950c366625a1709d317852f7fe9b799184b0fc7592c128a1cac2f99788d89"} Dec 01 09:43:10 crc kubenswrapper[4763]: I1201 09:43:10.140558 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" podStartSLOduration=1.6821991779999999 podStartE2EDuration="2.140438505s" podCreationTimestamp="2025-12-01 09:43:08 +0000 UTC" firstStartedPulling="2025-12-01 09:43:09.089489017 +0000 UTC m=+1706.358137785" lastFinishedPulling="2025-12-01 09:43:09.547728324 +0000 UTC m=+1706.816377112" observedRunningTime="2025-12-01 09:43:10.134906201 +0000 UTC m=+1707.403554979" watchObservedRunningTime="2025-12-01 09:43:10.140438505 +0000 UTC m=+1707.409087273" Dec 01 09:43:12 crc kubenswrapper[4763]: I1201 09:43:12.037566 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-m2rxm"] Dec 01 09:43:12 crc kubenswrapper[4763]: I1201 09:43:12.045193 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-m2rxm"] Dec 01 09:43:13 crc kubenswrapper[4763]: I1201 09:43:13.003429 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b221a534-cb94-42b4-af86-da3c66d827d5" path="/var/lib/kubelet/pods/b221a534-cb94-42b4-af86-da3c66d827d5/volumes" Dec 01 09:43:16 crc kubenswrapper[4763]: I1201 09:43:16.995505 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:43:16 crc kubenswrapper[4763]: E1201 09:43:16.996534 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:43:29 crc kubenswrapper[4763]: I1201 09:43:29.993920 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:43:29 crc kubenswrapper[4763]: E1201 09:43:29.994702 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:43:39 crc kubenswrapper[4763]: I1201 09:43:39.062912 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-glb7w"] Dec 01 09:43:39 crc kubenswrapper[4763]: I1201 09:43:39.071474 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-glb7w"] Dec 01 09:43:40 crc kubenswrapper[4763]: I1201 09:43:40.993963 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:43:40 crc kubenswrapper[4763]: E1201 09:43:40.994347 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:43:41 crc kubenswrapper[4763]: I1201 09:43:41.005528 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffbeb86-fa1d-4382-b9eb-a51bea87c540" path="/var/lib/kubelet/pods/5ffbeb86-fa1d-4382-b9eb-a51bea87c540/volumes" Dec 01 09:43:50 crc kubenswrapper[4763]: I1201 09:43:50.048816 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j4mtm"] Dec 01 09:43:50 crc kubenswrapper[4763]: I1201 09:43:50.059313 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j4mtm"] Dec 01 09:43:51 crc kubenswrapper[4763]: I1201 09:43:51.005663 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9ba86b1-459e-422a-859e-95a9901d8d93" path="/var/lib/kubelet/pods/a9ba86b1-459e-422a-859e-95a9901d8d93/volumes" Dec 01 09:43:52 crc kubenswrapper[4763]: I1201 09:43:52.030720 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-v52zn"] Dec 01 09:43:52 crc kubenswrapper[4763]: I1201 09:43:52.038716 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-sync-w9ssn"] Dec 01 09:43:52 crc kubenswrapper[4763]: I1201 09:43:52.046079 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-w9ssn"] Dec 01 09:43:52 crc kubenswrapper[4763]: I1201 09:43:52.054974 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-v52zn"] Dec 01 09:43:53 crc kubenswrapper[4763]: I1201 09:43:53.007812 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071c6297-c7b9-44fd-807a-43b881312f92" path="/var/lib/kubelet/pods/071c6297-c7b9-44fd-807a-43b881312f92/volumes" Dec 01 09:43:53 crc kubenswrapper[4763]: I1201 09:43:53.009047 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d785a5a5-a9b9-45f0-b36a-a66fd298bfac" path="/var/lib/kubelet/pods/d785a5a5-a9b9-45f0-b36a-a66fd298bfac/volumes" Dec 01 09:43:54 crc kubenswrapper[4763]: I1201 09:43:54.539899 4763 generic.go:334] "Generic (PLEG): container finished" podID="25ea7914-fefd-4dd4-a1aa-f0593874e6c2" containerID="4df950c366625a1709d317852f7fe9b799184b0fc7592c128a1cac2f99788d89" exitCode=0 Dec 01 09:43:54 crc kubenswrapper[4763]: I1201 09:43:54.539983 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" event={"ID":"25ea7914-fefd-4dd4-a1aa-f0593874e6c2","Type":"ContainerDied","Data":"4df950c366625a1709d317852f7fe9b799184b0fc7592c128a1cac2f99788d89"} Dec 01 09:43:55 crc kubenswrapper[4763]: I1201 09:43:55.931132 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.011035 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shd8p\" (UniqueName: \"kubernetes.io/projected/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-kube-api-access-shd8p\") pod \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.011268 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-inventory\") pod \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.011374 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-ssh-key\") pod \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\" (UID: \"25ea7914-fefd-4dd4-a1aa-f0593874e6c2\") " Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.013028 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:43:56 crc kubenswrapper[4763]: E1201 09:43:56.013537 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.038364 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-kube-api-access-shd8p" (OuterVolumeSpecName: "kube-api-access-shd8p") pod "25ea7914-fefd-4dd4-a1aa-f0593874e6c2" (UID: "25ea7914-fefd-4dd4-a1aa-f0593874e6c2"). InnerVolumeSpecName "kube-api-access-shd8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.111650 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-inventory" (OuterVolumeSpecName: "inventory") pod "25ea7914-fefd-4dd4-a1aa-f0593874e6c2" (UID: "25ea7914-fefd-4dd4-a1aa-f0593874e6c2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.114880 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.114920 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shd8p\" (UniqueName: \"kubernetes.io/projected/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-kube-api-access-shd8p\") on node \"crc\" DevicePath \"\"" Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.131598 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "25ea7914-fefd-4dd4-a1aa-f0593874e6c2" (UID: "25ea7914-fefd-4dd4-a1aa-f0593874e6c2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:43:56 crc kubenswrapper[4763]: I1201 09:43:56.216476 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/25ea7914-fefd-4dd4-a1aa-f0593874e6c2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.557032 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" event={"ID":"25ea7914-fefd-4dd4-a1aa-f0593874e6c2","Type":"ContainerDied","Data":"100a150b9c4a5b4c7ba5b40c558b35f2a7cb5067362d881c25a366f3329ffb50"} Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.557067 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="100a150b9c4a5b4c7ba5b40c558b35f2a7cb5067362d881c25a366f3329ffb50" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.557086 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.644514 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9"] Dec 01 09:44:01 crc kubenswrapper[4763]: E1201 09:43:56.644932 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ea7914-fefd-4dd4-a1aa-f0593874e6c2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.644944 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ea7914-fefd-4dd4-a1aa-f0593874e6c2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.645132 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ea7914-fefd-4dd4-a1aa-f0593874e6c2" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.645826 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.650071 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.650304 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.650511 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.655907 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.666785 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9"] Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.724385 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.724534 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62fj4\" (UniqueName: \"kubernetes.io/projected/a59b383d-c6e5-43da-8066-484e944a3ea8-kube-api-access-62fj4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.724582 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.826096 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.826286 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.826325 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62fj4\" (UniqueName: \"kubernetes.io/projected/a59b383d-c6e5-43da-8066-484e944a3ea8-kube-api-access-62fj4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.831042 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.831772 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.848519 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62fj4\" (UniqueName: \"kubernetes.io/projected/a59b383d-c6e5-43da-8066-484e944a3ea8-kube-api-access-62fj4\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:43:56.969122 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:44:00.465482 4763 patch_prober.go:28] interesting pod/controller-manager-884c54fcb-vnxbq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 09:44:01 crc kubenswrapper[4763]: I1201 09:44:00.465820 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-884c54fcb-vnxbq" podUID="e5273af4-4768-4ce2-a080-32bdcb3527cc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 09:44:02 crc kubenswrapper[4763]: I1201 09:44:02.363408 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9"] Dec 01 09:44:02 crc kubenswrapper[4763]: I1201 09:44:02.606658 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" event={"ID":"a59b383d-c6e5-43da-8066-484e944a3ea8","Type":"ContainerStarted","Data":"315fe66fd7dc8d4e08e606ceb8fc1a8fab4b5dc57da9305c86bf30c700db7c59"} Dec 01 09:44:02 crc kubenswrapper[4763]: I1201 09:44:02.931700 4763 scope.go:117] "RemoveContainer" containerID="426f6c4e94728dd50e1965a4abfbc0e39468a9a7931ff72315cfc98f7bfb584f" Dec 01 09:44:02 crc kubenswrapper[4763]: I1201 09:44:02.994751 4763 scope.go:117] "RemoveContainer" containerID="8ca12f3f037c2e21aecaaa07ea094c46dfb42a47fc7e7a6985db3c5500df52cc" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.049294 4763 scope.go:117] "RemoveContainer" containerID="d13a60015043ad5554df404d33189aa6990d47b71ad546a479a1c8654d6a03d8" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.146513 4763 scope.go:117] "RemoveContainer" containerID="f1d6a1b7ba846a48f945ae146dddbcadfba7d1fcd6f8ecb750d417e7408b7cb2" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.251308 4763 scope.go:117] "RemoveContainer" containerID="c62d1f1acd610597fd9d3c1da14c73ed1475bfe8907cab68703f4782777d78e5" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.275603 4763 scope.go:117] "RemoveContainer" containerID="e37beafcdc97dcbfd0a7626e9a9bc5a77f0abe7b6dcc633b88918edcd967be40" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.297664 4763 scope.go:117] "RemoveContainer" containerID="23c13ea7c64fcd259467f9887f5be246b537ba8010d5e7e0911f6f48251109d7" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.338296 4763 scope.go:117] "RemoveContainer" containerID="337aad30eae8e92710957f5d8af1dced83f62a1d549f8559a77714f4cf87bef9" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.359143 4763 scope.go:117] "RemoveContainer" containerID="059680f31240098fe413b8617fe9d99490758b5f2086fc49ac1d036b45555084" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.440092 4763 scope.go:117] "RemoveContainer" containerID="ae71100dd4a342ac7b18e44eb49d8dff2017203a3eac025a57e43236d1d7787d" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.471838 4763 scope.go:117] "RemoveContainer" containerID="e933b4f6e7d82e8340a2085be91679a971ad5780f8e2701a180ef7c495cf8b2a" Dec 01 09:44:03 crc kubenswrapper[4763]: I1201 09:44:03.622059 4763 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" event={"ID":"a59b383d-c6e5-43da-8066-484e944a3ea8","Type":"ContainerStarted","Data":"91c844f118225e46f64f77db9eec96e265f22dd5d99dd382eec99958c04c31b3"} Dec 01 09:44:08 crc kubenswrapper[4763]: I1201 09:44:08.676943 4763 generic.go:334] "Generic (PLEG): container finished" podID="a59b383d-c6e5-43da-8066-484e944a3ea8" containerID="91c844f118225e46f64f77db9eec96e265f22dd5d99dd382eec99958c04c31b3" exitCode=0 Dec 01 09:44:08 crc kubenswrapper[4763]: I1201 09:44:08.677103 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" event={"ID":"a59b383d-c6e5-43da-8066-484e944a3ea8","Type":"ContainerDied","Data":"91c844f118225e46f64f77db9eec96e265f22dd5d99dd382eec99958c04c31b3"} Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.037805 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xhw7j"] Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.048836 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xhw7j"] Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.097585 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.194760 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-inventory\") pod \"a59b383d-c6e5-43da-8066-484e944a3ea8\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.194834 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62fj4\" (UniqueName: \"kubernetes.io/projected/a59b383d-c6e5-43da-8066-484e944a3ea8-kube-api-access-62fj4\") pod \"a59b383d-c6e5-43da-8066-484e944a3ea8\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.194966 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-ssh-key\") pod \"a59b383d-c6e5-43da-8066-484e944a3ea8\" (UID: \"a59b383d-c6e5-43da-8066-484e944a3ea8\") " Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.204473 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59b383d-c6e5-43da-8066-484e944a3ea8-kube-api-access-62fj4" (OuterVolumeSpecName: "kube-api-access-62fj4") pod "a59b383d-c6e5-43da-8066-484e944a3ea8" (UID: "a59b383d-c6e5-43da-8066-484e944a3ea8"). InnerVolumeSpecName "kube-api-access-62fj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.225868 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-inventory" (OuterVolumeSpecName: "inventory") pod "a59b383d-c6e5-43da-8066-484e944a3ea8" (UID: "a59b383d-c6e5-43da-8066-484e944a3ea8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.236661 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a59b383d-c6e5-43da-8066-484e944a3ea8" (UID: "a59b383d-c6e5-43da-8066-484e944a3ea8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.299089 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.299435 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62fj4\" (UniqueName: \"kubernetes.io/projected/a59b383d-c6e5-43da-8066-484e944a3ea8-kube-api-access-62fj4\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.299451 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a59b383d-c6e5-43da-8066-484e944a3ea8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.694546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" event={"ID":"a59b383d-c6e5-43da-8066-484e944a3ea8","Type":"ContainerDied","Data":"315fe66fd7dc8d4e08e606ceb8fc1a8fab4b5dc57da9305c86bf30c700db7c59"} Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.694584 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="315fe66fd7dc8d4e08e606ceb8fc1a8fab4b5dc57da9305c86bf30c700db7c59" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.694635 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.782649 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2"] Dec 01 09:44:10 crc kubenswrapper[4763]: E1201 09:44:10.783074 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59b383d-c6e5-43da-8066-484e944a3ea8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.783099 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59b383d-c6e5-43da-8066-484e944a3ea8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.783324 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59b383d-c6e5-43da-8066-484e944a3ea8" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.784097 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.789941 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.790027 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.790224 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.790318 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.796175 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2"] Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.911397 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7ldw\" (UniqueName: \"kubernetes.io/projected/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-kube-api-access-j7ldw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.911449 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.911595 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:10 crc kubenswrapper[4763]: I1201 09:44:10.994047 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:44:10 crc kubenswrapper[4763]: E1201 09:44:10.994285 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.005436 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652820eb-87dd-4c77-bef1-1bcd7e68fdf5" path="/var/lib/kubelet/pods/652820eb-87dd-4c77-bef1-1bcd7e68fdf5/volumes" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.013199 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.013305 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.013412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7ldw\" (UniqueName: \"kubernetes.io/projected/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-kube-api-access-j7ldw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.017491 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.019088 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.039632 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7ldw\" (UniqueName: \"kubernetes.io/projected/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-kube-api-access-j7ldw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.115784 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.650790 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2"] Dec 01 09:44:11 crc kubenswrapper[4763]: I1201 09:44:11.708876 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" event={"ID":"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb","Type":"ContainerStarted","Data":"11630ad41b0ad603a55399eef68fe0ad364b550503d1a5f6c4baa12cfd7be8be"} Dec 01 09:44:12 crc kubenswrapper[4763]: I1201 09:44:12.718592 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" event={"ID":"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb","Type":"ContainerStarted","Data":"af6d94ecfb66a1908fd633d6ef168f9b2083f29a5fee46489319682d3fcd12e7"} Dec 01 09:44:12 crc kubenswrapper[4763]: I1201 09:44:12.736105 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" podStartSLOduration=2.173593427 podStartE2EDuration="2.73608498s" podCreationTimestamp="2025-12-01 09:44:10 +0000 UTC" firstStartedPulling="2025-12-01 09:44:11.663021482 +0000 UTC m=+1768.931670240" lastFinishedPulling="2025-12-01 09:44:12.225512985 +0000 UTC m=+1769.494161793" observedRunningTime="2025-12-01 09:44:12.732405759 +0000 UTC m=+1770.001054527" watchObservedRunningTime="2025-12-01 09:44:12.73608498 +0000 UTC m=+1770.004733748" Dec 01 09:44:15 crc kubenswrapper[4763]: I1201 09:44:15.043149 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vh7wz"] Dec 01 09:44:15 crc kubenswrapper[4763]: I1201 09:44:15.054906 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-vh7wz"] Dec 01 09:44:17 crc kubenswrapper[4763]: I1201 09:44:17.005772 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce04c2cb-ee0d-4530-8007-a853f1d4e785" path="/var/lib/kubelet/pods/ce04c2cb-ee0d-4530-8007-a853f1d4e785/volumes" Dec 01 09:44:24 crc kubenswrapper[4763]: I1201 09:44:24.994268 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:44:24 crc kubenswrapper[4763]: E1201 09:44:24.995810 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:44:35 crc kubenswrapper[4763]: I1201 09:44:35.994029 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:44:35 crc kubenswrapper[4763]: E1201 09:44:35.994816 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:44:48 crc 
kubenswrapper[4763]: I1201 09:44:48.994850 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:44:48 crc kubenswrapper[4763]: E1201 09:44:48.995821 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.058620 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vrlcm"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.072049 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-fmhwj"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.128871 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fd25-account-create-update-2qvtq"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.132671 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-sbf6n"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.145822 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6992-account-create-update-g66rw"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.159195 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3d32-account-create-update-shdxp"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.170329 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vrlcm"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.183152 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6992-account-create-update-g66rw"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.190693 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fd25-account-create-update-2qvtq"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.202437 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3d32-account-create-update-shdxp"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.210884 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fmhwj"] Dec 01 09:44:56 crc kubenswrapper[4763]: I1201 09:44:56.219008 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-sbf6n"] Dec 01 09:44:57 crc kubenswrapper[4763]: I1201 09:44:57.004153 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="445a892a-304f-4d8b-9163-78f5a1011a53" path="/var/lib/kubelet/pods/445a892a-304f-4d8b-9163-78f5a1011a53/volumes" Dec 01 09:44:57 crc kubenswrapper[4763]: I1201 09:44:57.004746 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca" path="/var/lib/kubelet/pods/6e76ebbd-14ae-4886-a9b4-2bf369c2e0ca/volumes" Dec 01 09:44:57 crc kubenswrapper[4763]: I1201 09:44:57.005269 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ff2701-69e8-4c30-a9cd-76a9862849e3" path="/var/lib/kubelet/pods/83ff2701-69e8-4c30-a9cd-76a9862849e3/volumes" Dec 01 09:44:57 crc kubenswrapper[4763]: I1201 09:44:57.005833 4763 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="93979ddc-1fca-4d13-bad7-7123ec597957" path="/var/lib/kubelet/pods/93979ddc-1fca-4d13-bad7-7123ec597957/volumes" Dec 01 09:44:57 crc kubenswrapper[4763]: I1201 09:44:57.006859 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70433ec-65b9-46db-8068-4283eb38245a" path="/var/lib/kubelet/pods/a70433ec-65b9-46db-8068-4283eb38245a/volumes" Dec 01 09:44:57 crc kubenswrapper[4763]: I1201 09:44:57.007585 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45c06ea-1ae1-4524-a3a5-704562d6aaab" path="/var/lib/kubelet/pods/b45c06ea-1ae1-4524-a3a5-704562d6aaab/volumes" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.166181 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt"] Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.168653 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.172825 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.173029 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.193358 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt"] Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.248263 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v7ck\" (UniqueName: \"kubernetes.io/projected/22700f51-62ff-4238-9696-72f1446882f0-kube-api-access-6v7ck\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.248345 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22700f51-62ff-4238-9696-72f1446882f0-config-volume\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.255436 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22700f51-62ff-4238-9696-72f1446882f0-secret-volume\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.357261 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22700f51-62ff-4238-9696-72f1446882f0-secret-volume\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.357361 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v7ck\" 
(UniqueName: \"kubernetes.io/projected/22700f51-62ff-4238-9696-72f1446882f0-kube-api-access-6v7ck\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.357386 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22700f51-62ff-4238-9696-72f1446882f0-config-volume\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.358373 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22700f51-62ff-4238-9696-72f1446882f0-config-volume\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.366107 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22700f51-62ff-4238-9696-72f1446882f0-secret-volume\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.378076 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v7ck\" (UniqueName: \"kubernetes.io/projected/22700f51-62ff-4238-9696-72f1446882f0-kube-api-access-6v7ck\") pod \"collect-profiles-29409705-z29tt\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.494035 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:00 crc kubenswrapper[4763]: I1201 09:45:00.981038 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt"] Dec 01 09:45:01 crc kubenswrapper[4763]: I1201 09:45:01.127248 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" event={"ID":"22700f51-62ff-4238-9696-72f1446882f0","Type":"ContainerStarted","Data":"9c9485075408d0bef14ce7787304b73e9fb538a7fe7c0e37c313d7fd823e2d9f"} Dec 01 09:45:01 crc kubenswrapper[4763]: I1201 09:45:01.994533 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:45:01 crc kubenswrapper[4763]: E1201 09:45:01.995032 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:45:02 crc kubenswrapper[4763]: I1201 09:45:02.137545 4763 generic.go:334] "Generic (PLEG): container finished" podID="22700f51-62ff-4238-9696-72f1446882f0" containerID="a7a96ec71d3abef637dab6440cbf0a9db6f83af1bfec3ed5c8a9ec00b6a4d184" exitCode=0 Dec 01 09:45:02 crc kubenswrapper[4763]: I1201 09:45:02.137589 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" event={"ID":"22700f51-62ff-4238-9696-72f1446882f0","Type":"ContainerDied","Data":"a7a96ec71d3abef637dab6440cbf0a9db6f83af1bfec3ed5c8a9ec00b6a4d184"} Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.472104 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.610886 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22700f51-62ff-4238-9696-72f1446882f0-config-volume\") pod \"22700f51-62ff-4238-9696-72f1446882f0\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.611084 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v7ck\" (UniqueName: \"kubernetes.io/projected/22700f51-62ff-4238-9696-72f1446882f0-kube-api-access-6v7ck\") pod \"22700f51-62ff-4238-9696-72f1446882f0\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.611213 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22700f51-62ff-4238-9696-72f1446882f0-secret-volume\") pod \"22700f51-62ff-4238-9696-72f1446882f0\" (UID: \"22700f51-62ff-4238-9696-72f1446882f0\") " Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.612339 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22700f51-62ff-4238-9696-72f1446882f0-config-volume" (OuterVolumeSpecName: "config-volume") pod "22700f51-62ff-4238-9696-72f1446882f0" (UID: "22700f51-62ff-4238-9696-72f1446882f0"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.617435 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22700f51-62ff-4238-9696-72f1446882f0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "22700f51-62ff-4238-9696-72f1446882f0" (UID: "22700f51-62ff-4238-9696-72f1446882f0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.627256 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22700f51-62ff-4238-9696-72f1446882f0-kube-api-access-6v7ck" (OuterVolumeSpecName: "kube-api-access-6v7ck") pod "22700f51-62ff-4238-9696-72f1446882f0" (UID: "22700f51-62ff-4238-9696-72f1446882f0"). InnerVolumeSpecName "kube-api-access-6v7ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.714359 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v7ck\" (UniqueName: \"kubernetes.io/projected/22700f51-62ff-4238-9696-72f1446882f0-kube-api-access-6v7ck\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.714397 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22700f51-62ff-4238-9696-72f1446882f0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.714411 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22700f51-62ff-4238-9696-72f1446882f0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.748695 4763 scope.go:117] "RemoveContainer" containerID="2c84d8c48283604e169a910c1b0a8ab513166332aec0973068d293fbd60f89a0" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.772048 4763 scope.go:117] "RemoveContainer" containerID="0d5e1d579e04c894d2d13ec348b2bc0009855e6f729e9c0b4c6b59fb87e3c77b" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.809720 4763 scope.go:117] "RemoveContainer" containerID="8ffa6b2dce43b3a67ba79e7c7ee917d1ee811d6804bbf8fb3c9a7a9f6d842d74" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.830950 4763 scope.go:117] "RemoveContainer" containerID="adf6a0703a06a8706c6f697716a2a1af6932a0bde37a76269af46c5a94b63797" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.859811 4763 scope.go:117] "RemoveContainer" containerID="060e281bc4aca80a2cc3bf1915e50c13eaaf69d34ba12237d8d225843aadee2a" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.879952 4763 scope.go:117] "RemoveContainer" containerID="8d2459adeba909601ab768047c986e689580ce0271262b8f4a8796ed4dd3b86a" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.906368 4763 scope.go:117] "RemoveContainer" containerID="8eb18e37b6d89aa6c37b819e96bfcb945a3bfbf5d3ea26522115ef0f65f1e8a3" Dec 01 09:45:03 crc kubenswrapper[4763]: I1201 09:45:03.931306 4763 scope.go:117] "RemoveContainer" containerID="4c938982466fb132cad880ab6c290acd16f9ee48487cdeee008ca5f4196ac52a" Dec 01 09:45:04 crc kubenswrapper[4763]: I1201 09:45:04.153264 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" 
event={"ID":"22700f51-62ff-4238-9696-72f1446882f0","Type":"ContainerDied","Data":"9c9485075408d0bef14ce7787304b73e9fb538a7fe7c0e37c313d7fd823e2d9f"} Dec 01 09:45:04 crc kubenswrapper[4763]: I1201 09:45:04.153624 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c9485075408d0bef14ce7787304b73e9fb538a7fe7c0e37c313d7fd823e2d9f" Dec 01 09:45:04 crc kubenswrapper[4763]: I1201 09:45:04.153315 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt" Dec 01 09:45:08 crc kubenswrapper[4763]: I1201 09:45:08.185671 4763 generic.go:334] "Generic (PLEG): container finished" podID="cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb" containerID="af6d94ecfb66a1908fd633d6ef168f9b2083f29a5fee46489319682d3fcd12e7" exitCode=0 Dec 01 09:45:08 crc kubenswrapper[4763]: I1201 09:45:08.185836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" event={"ID":"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb","Type":"ContainerDied","Data":"af6d94ecfb66a1908fd633d6ef168f9b2083f29a5fee46489319682d3fcd12e7"} Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.020562 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.127615 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-ssh-key\") pod \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.128011 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7ldw\" (UniqueName: \"kubernetes.io/projected/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-kube-api-access-j7ldw\") pod \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.128275 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-inventory\") pod \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\" (UID: \"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb\") " Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.133149 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-kube-api-access-j7ldw" (OuterVolumeSpecName: "kube-api-access-j7ldw") pod "cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb" (UID: "cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb"). InnerVolumeSpecName "kube-api-access-j7ldw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.161400 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb" (UID: "cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.162669 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-inventory" (OuterVolumeSpecName: "inventory") pod "cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb" (UID: "cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.206802 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" event={"ID":"cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb","Type":"ContainerDied","Data":"11630ad41b0ad603a55399eef68fe0ad364b550503d1a5f6c4baa12cfd7be8be"} Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.206847 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11630ad41b0ad603a55399eef68fe0ad364b550503d1a5f6c4baa12cfd7be8be" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.206910 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.231333 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.231362 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.231374 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7ldw\" (UniqueName: \"kubernetes.io/projected/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb-kube-api-access-j7ldw\") on node \"crc\" DevicePath \"\"" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.275884 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hz9sr"] Dec 01 09:45:10 crc kubenswrapper[4763]: E1201 09:45:10.276375 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.276400 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:10 crc kubenswrapper[4763]: E1201 09:45:10.276426 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22700f51-62ff-4238-9696-72f1446882f0" containerName="collect-profiles" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.276432 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="22700f51-62ff-4238-9696-72f1446882f0" containerName="collect-profiles" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.276611 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.276633 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="22700f51-62ff-4238-9696-72f1446882f0" containerName="collect-profiles" Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 
09:45:10.277190 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.279363 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.283555 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.283573 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.284018 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.286529 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hz9sr"]
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.434856 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc9k4\" (UniqueName: \"kubernetes.io/projected/e66619ba-16d0-4218-bf3a-652bf97bdcce-kube-api-access-nc9k4\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.435033 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.435145 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.536897 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.537067 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.537154 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc9k4\" (UniqueName: \"kubernetes.io/projected/e66619ba-16d0-4218-bf3a-652bf97bdcce-kube-api-access-nc9k4\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.541510 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.546158 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.562175 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc9k4\" (UniqueName: \"kubernetes.io/projected/e66619ba-16d0-4218-bf3a-652bf97bdcce-kube-api-access-nc9k4\") pod \"ssh-known-hosts-edpm-deployment-hz9sr\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") " pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:10 crc kubenswrapper[4763]: I1201 09:45:10.614984 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:11 crc kubenswrapper[4763]: I1201 09:45:11.147041 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hz9sr"]
Dec 01 09:45:11 crc kubenswrapper[4763]: I1201 09:45:11.215283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr" event={"ID":"e66619ba-16d0-4218-bf3a-652bf97bdcce","Type":"ContainerStarted","Data":"8c28aa0a493eeddec5bc567dc38a064e3f7cb25b3644c9f257af0034af3dd5b8"}
Dec 01 09:45:12 crc kubenswrapper[4763]: I1201 09:45:12.225356 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr" event={"ID":"e66619ba-16d0-4218-bf3a-652bf97bdcce","Type":"ContainerStarted","Data":"5bc222005413f0654dd2a17f3cce34a9e960b5c4004cc5e7fb9a3dbea40cbf19"}
Dec 01 09:45:12 crc kubenswrapper[4763]: I1201 09:45:12.242630 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr" podStartSLOduration=1.733418921 podStartE2EDuration="2.242614703s" podCreationTimestamp="2025-12-01 09:45:10 +0000 UTC" firstStartedPulling="2025-12-01 09:45:11.13932335 +0000 UTC m=+1828.407972118" lastFinishedPulling="2025-12-01 09:45:11.648519132 +0000 UTC m=+1828.917167900" observedRunningTime="2025-12-01 09:45:12.237325577 +0000 UTC m=+1829.505974355" watchObservedRunningTime="2025-12-01 09:45:12.242614703 +0000 UTC m=+1829.511263471"
Dec 01 09:45:13 crc kubenswrapper[4763]: I1201 09:45:13.994051 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:45:13 crc kubenswrapper[4763]: E1201 09:45:13.994700 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:45:20 crc kubenswrapper[4763]: I1201 09:45:20.286136 4763 generic.go:334] "Generic (PLEG): container finished" podID="e66619ba-16d0-4218-bf3a-652bf97bdcce" containerID="5bc222005413f0654dd2a17f3cce34a9e960b5c4004cc5e7fb9a3dbea40cbf19" exitCode=0
Dec 01 09:45:20 crc kubenswrapper[4763]: I1201 09:45:20.286184 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr" event={"ID":"e66619ba-16d0-4218-bf3a-652bf97bdcce","Type":"ContainerDied","Data":"5bc222005413f0654dd2a17f3cce34a9e960b5c4004cc5e7fb9a3dbea40cbf19"}
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.678923 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.845944 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc9k4\" (UniqueName: \"kubernetes.io/projected/e66619ba-16d0-4218-bf3a-652bf97bdcce-kube-api-access-nc9k4\") pod \"e66619ba-16d0-4218-bf3a-652bf97bdcce\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") "
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.846193 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-ssh-key-openstack-edpm-ipam\") pod \"e66619ba-16d0-4218-bf3a-652bf97bdcce\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") "
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.846314 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-inventory-0\") pod \"e66619ba-16d0-4218-bf3a-652bf97bdcce\" (UID: \"e66619ba-16d0-4218-bf3a-652bf97bdcce\") "
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.854512 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66619ba-16d0-4218-bf3a-652bf97bdcce-kube-api-access-nc9k4" (OuterVolumeSpecName: "kube-api-access-nc9k4") pod "e66619ba-16d0-4218-bf3a-652bf97bdcce" (UID: "e66619ba-16d0-4218-bf3a-652bf97bdcce"). InnerVolumeSpecName "kube-api-access-nc9k4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.872852 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e66619ba-16d0-4218-bf3a-652bf97bdcce" (UID: "e66619ba-16d0-4218-bf3a-652bf97bdcce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.880529 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e66619ba-16d0-4218-bf3a-652bf97bdcce" (UID: "e66619ba-16d0-4218-bf3a-652bf97bdcce"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.948211 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.948428 4763 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e66619ba-16d0-4218-bf3a-652bf97bdcce-inventory-0\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:21 crc kubenswrapper[4763]: I1201 09:45:21.948556 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc9k4\" (UniqueName: \"kubernetes.io/projected/e66619ba-16d0-4218-bf3a-652bf97bdcce-kube-api-access-nc9k4\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.308859 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr" event={"ID":"e66619ba-16d0-4218-bf3a-652bf97bdcce","Type":"ContainerDied","Data":"8c28aa0a493eeddec5bc567dc38a064e3f7cb25b3644c9f257af0034af3dd5b8"}
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.309220 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c28aa0a493eeddec5bc567dc38a064e3f7cb25b3644c9f257af0034af3dd5b8"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.309291 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hz9sr"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.368572 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"]
Dec 01 09:45:22 crc kubenswrapper[4763]: E1201 09:45:22.368920 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66619ba-16d0-4218-bf3a-652bf97bdcce" containerName="ssh-known-hosts-edpm-deployment"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.368936 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66619ba-16d0-4218-bf3a-652bf97bdcce" containerName="ssh-known-hosts-edpm-deployment"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.369192 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66619ba-16d0-4218-bf3a-652bf97bdcce" containerName="ssh-known-hosts-edpm-deployment"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.369829 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.371554 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.371762 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.371947 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.372139 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.410264 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"]
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.457425 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.457509 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwfg\" (UniqueName: \"kubernetes.io/projected/f723fec0-c702-47aa-bf56-ea502c4ce783-kube-api-access-njwfg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.457552 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.558555 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.558617 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwfg\" (UniqueName: \"kubernetes.io/projected/f723fec0-c702-47aa-bf56-ea502c4ce783-kube-api-access-njwfg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.558643 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.563641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.571117 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.577950 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njwfg\" (UniqueName: \"kubernetes.io/projected/f723fec0-c702-47aa-bf56-ea502c4ce783-kube-api-access-njwfg\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5h9kc\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:22 crc kubenswrapper[4763]: I1201 09:45:22.689162 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:23 crc kubenswrapper[4763]: I1201 09:45:23.171513 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"]
Dec 01 09:45:23 crc kubenswrapper[4763]: I1201 09:45:23.317533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc" event={"ID":"f723fec0-c702-47aa-bf56-ea502c4ce783","Type":"ContainerStarted","Data":"e8081bb1a691d7e44be891e39ac68c157e33149ffadfffd5eb47c4d01611ee70"}
Dec 01 09:45:24 crc kubenswrapper[4763]: I1201 09:45:24.327359 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc" event={"ID":"f723fec0-c702-47aa-bf56-ea502c4ce783","Type":"ContainerStarted","Data":"d89e372174d4a35a7ed139f057cb2cc71d9b1c04dbee1ce33f0c43f20a50d2c0"}
Dec 01 09:45:24 crc kubenswrapper[4763]: I1201 09:45:24.350046 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc" podStartSLOduration=1.590819821 podStartE2EDuration="2.350028422s" podCreationTimestamp="2025-12-01 09:45:22 +0000 UTC" firstStartedPulling="2025-12-01 09:45:23.175785754 +0000 UTC m=+1840.444434522" lastFinishedPulling="2025-12-01 09:45:23.934994355 +0000 UTC m=+1841.203643123" observedRunningTime="2025-12-01 09:45:24.341426725 +0000 UTC m=+1841.610075493" watchObservedRunningTime="2025-12-01 09:45:24.350028422 +0000 UTC m=+1841.618677190"
Dec 01 09:45:25 crc kubenswrapper[4763]: I1201 09:45:25.994806 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:45:25 crc kubenswrapper[4763]: E1201 09:45:25.995340 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:45:26 crc kubenswrapper[4763]: I1201 09:45:26.041176 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b92hz"]
Dec 01 09:45:26 crc kubenswrapper[4763]: I1201 09:45:26.051277 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-b92hz"]
Dec 01 09:45:27 crc kubenswrapper[4763]: I1201 09:45:27.006657 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da701d9-7efc-4ae7-bcfe-eeed1e7312a2" path="/var/lib/kubelet/pods/2da701d9-7efc-4ae7-bcfe-eeed1e7312a2/volumes"
Dec 01 09:45:33 crc kubenswrapper[4763]: I1201 09:45:33.421157 4763 generic.go:334] "Generic (PLEG): container finished" podID="f723fec0-c702-47aa-bf56-ea502c4ce783" containerID="d89e372174d4a35a7ed139f057cb2cc71d9b1c04dbee1ce33f0c43f20a50d2c0" exitCode=0
Dec 01 09:45:33 crc kubenswrapper[4763]: I1201 09:45:33.421238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc" event={"ID":"f723fec0-c702-47aa-bf56-ea502c4ce783","Type":"ContainerDied","Data":"d89e372174d4a35a7ed139f057cb2cc71d9b1c04dbee1ce33f0c43f20a50d2c0"}
Dec 01 09:45:34 crc kubenswrapper[4763]: I1201 09:45:34.830729 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:34 crc kubenswrapper[4763]: I1201 09:45:34.985121 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-inventory\") pod \"f723fec0-c702-47aa-bf56-ea502c4ce783\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") "
Dec 01 09:45:34 crc kubenswrapper[4763]: I1201 09:45:34.985298 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-ssh-key\") pod \"f723fec0-c702-47aa-bf56-ea502c4ce783\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") "
Dec 01 09:45:34 crc kubenswrapper[4763]: I1201 09:45:34.985571 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njwfg\" (UniqueName: \"kubernetes.io/projected/f723fec0-c702-47aa-bf56-ea502c4ce783-kube-api-access-njwfg\") pod \"f723fec0-c702-47aa-bf56-ea502c4ce783\" (UID: \"f723fec0-c702-47aa-bf56-ea502c4ce783\") "
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.001710 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f723fec0-c702-47aa-bf56-ea502c4ce783-kube-api-access-njwfg" (OuterVolumeSpecName: "kube-api-access-njwfg") pod "f723fec0-c702-47aa-bf56-ea502c4ce783" (UID: "f723fec0-c702-47aa-bf56-ea502c4ce783"). InnerVolumeSpecName "kube-api-access-njwfg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.016430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-inventory" (OuterVolumeSpecName: "inventory") pod "f723fec0-c702-47aa-bf56-ea502c4ce783" (UID: "f723fec0-c702-47aa-bf56-ea502c4ce783"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.017782 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f723fec0-c702-47aa-bf56-ea502c4ce783" (UID: "f723fec0-c702-47aa-bf56-ea502c4ce783"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.087528 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.087725 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njwfg\" (UniqueName: \"kubernetes.io/projected/f723fec0-c702-47aa-bf56-ea502c4ce783-kube-api-access-njwfg\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.087833 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f723fec0-c702-47aa-bf56-ea502c4ce783-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.440614 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc" event={"ID":"f723fec0-c702-47aa-bf56-ea502c4ce783","Type":"ContainerDied","Data":"e8081bb1a691d7e44be891e39ac68c157e33149ffadfffd5eb47c4d01611ee70"}
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.440661 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8081bb1a691d7e44be891e39ac68c157e33149ffadfffd5eb47c4d01611ee70"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.440668 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.521323 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"]
Dec 01 09:45:35 crc kubenswrapper[4763]: E1201 09:45:35.521980 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f723fec0-c702-47aa-bf56-ea502c4ce783" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.522006 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f723fec0-c702-47aa-bf56-ea502c4ce783" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.522266 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f723fec0-c702-47aa-bf56-ea502c4ce783" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.523052 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.530500 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.531353 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.531814 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.541908 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"]
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.549392 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.700720 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25v4t\" (UniqueName: \"kubernetes.io/projected/52acb043-844e-4401-8453-075ba1a4c174-kube-api-access-25v4t\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.700811 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.700883 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.803155 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.803324 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25v4t\" (UniqueName: \"kubernetes.io/projected/52acb043-844e-4401-8453-075ba1a4c174-kube-api-access-25v4t\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.803389 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.808931 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.809069 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.826129 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25v4t\" (UniqueName: \"kubernetes.io/projected/52acb043-844e-4401-8453-075ba1a4c174-kube-api-access-25v4t\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:35 crc kubenswrapper[4763]: I1201 09:45:35.847343 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:36 crc kubenswrapper[4763]: I1201 09:45:36.361121 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"]
Dec 01 09:45:36 crc kubenswrapper[4763]: I1201 09:45:36.448710 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc" event={"ID":"52acb043-844e-4401-8453-075ba1a4c174","Type":"ContainerStarted","Data":"a9c0edf123b007cda5bdadc8f9c2ab7cf66199b1e25b7b3a65815995d1f7d258"}
Dec 01 09:45:37 crc kubenswrapper[4763]: I1201 09:45:37.461992 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc" event={"ID":"52acb043-844e-4401-8453-075ba1a4c174","Type":"ContainerStarted","Data":"0170caf3836ae474cbda75d254d35da72aeaffe23ef3589ece01c9157b2dee2d"}
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.089100 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc" podStartSLOduration=2.64036984 podStartE2EDuration="3.089083595s" podCreationTimestamp="2025-12-01 09:45:35 +0000 UTC" firstStartedPulling="2025-12-01 09:45:36.364381469 +0000 UTC m=+1853.633030237" lastFinishedPulling="2025-12-01 09:45:36.813095224 +0000 UTC m=+1854.081743992" observedRunningTime="2025-12-01 09:45:37.49122229 +0000 UTC m=+1854.759871058" watchObservedRunningTime="2025-12-01 09:45:38.089083595 +0000 UTC m=+1855.357732363"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.091352 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cblm8"]
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.093731 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.116848 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cblm8"]
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.253264 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-utilities\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.253341 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-catalog-content\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.253395 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tbgf\" (UniqueName: \"kubernetes.io/projected/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-kube-api-access-9tbgf\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.355029 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-utilities\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.355450 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-catalog-content\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.355671 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tbgf\" (UniqueName: \"kubernetes.io/projected/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-kube-api-access-9tbgf\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.355488 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-utilities\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.355729 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-catalog-content\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.377995 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tbgf\" (UniqueName: \"kubernetes.io/projected/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-kube-api-access-9tbgf\") pod \"redhat-operators-cblm8\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") " pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.412062 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:38 crc kubenswrapper[4763]: I1201 09:45:38.904949 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cblm8"]
Dec 01 09:45:39 crc kubenswrapper[4763]: I1201 09:45:39.482682 4763 generic.go:334] "Generic (PLEG): container finished" podID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerID="5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d" exitCode=0
Dec 01 09:45:39 crc kubenswrapper[4763]: I1201 09:45:39.482917 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cblm8" event={"ID":"f01f3219-528c-4813-aa4b-7ae0f02f8e2a","Type":"ContainerDied","Data":"5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d"}
Dec 01 09:45:39 crc kubenswrapper[4763]: I1201 09:45:39.482982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cblm8" event={"ID":"f01f3219-528c-4813-aa4b-7ae0f02f8e2a","Type":"ContainerStarted","Data":"ac453debe0380fd8715a9b90c4a0aa9803cca296efa554252de395c9f287e864"}
Dec 01 09:45:40 crc kubenswrapper[4763]: I1201 09:45:40.994003 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:45:41 crc kubenswrapper[4763]: E1201 09:45:40.994642 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:45:41 crc kubenswrapper[4763]: I1201 09:45:41.499828 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cblm8" event={"ID":"f01f3219-528c-4813-aa4b-7ae0f02f8e2a","Type":"ContainerStarted","Data":"1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d"}
Dec 01 09:45:45 crc kubenswrapper[4763]: I1201 09:45:45.531582 4763 generic.go:334] "Generic (PLEG): container finished" podID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerID="1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d" exitCode=0
Dec 01 09:45:45 crc kubenswrapper[4763]: I1201 09:45:45.531685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cblm8" event={"ID":"f01f3219-528c-4813-aa4b-7ae0f02f8e2a","Type":"ContainerDied","Data":"1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d"}
Dec 01 09:45:46 crc kubenswrapper[4763]: I1201 09:45:46.543641 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cblm8" event={"ID":"f01f3219-528c-4813-aa4b-7ae0f02f8e2a","Type":"ContainerStarted","Data":"eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01"}
Dec 01 09:45:47 crc kubenswrapper[4763]: E1201 09:45:47.444228 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52acb043_844e_4401_8453_075ba1a4c174.slice/crio-conmon-0170caf3836ae474cbda75d254d35da72aeaffe23ef3589ece01c9157b2dee2d.scope\": RecentStats: unable to find data in memory cache]"
Dec 01 09:45:47 crc kubenswrapper[4763]: I1201 09:45:47.553252 4763 generic.go:334] "Generic (PLEG): container finished" podID="52acb043-844e-4401-8453-075ba1a4c174" containerID="0170caf3836ae474cbda75d254d35da72aeaffe23ef3589ece01c9157b2dee2d" exitCode=0
Dec 01 09:45:47 crc kubenswrapper[4763]: I1201 09:45:47.553303 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc" event={"ID":"52acb043-844e-4401-8453-075ba1a4c174","Type":"ContainerDied","Data":"0170caf3836ae474cbda75d254d35da72aeaffe23ef3589ece01c9157b2dee2d"}
Dec 01 09:45:47 crc kubenswrapper[4763]: I1201 09:45:47.576566 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cblm8" podStartSLOduration=3.098394462 podStartE2EDuration="9.576546348s" podCreationTimestamp="2025-12-01 09:45:38 +0000 UTC" firstStartedPulling="2025-12-01 09:45:39.485372812 +0000 UTC m=+1856.754021580" lastFinishedPulling="2025-12-01 09:45:45.963524698 +0000 UTC m=+1863.232173466" observedRunningTime="2025-12-01 09:45:46.568999363 +0000 UTC m=+1863.837648131" watchObservedRunningTime="2025-12-01 09:45:47.576546348 +0000 UTC m=+1864.845195116"
Dec 01 09:45:48 crc kubenswrapper[4763]: I1201 09:45:48.412539 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:48 crc kubenswrapper[4763]: I1201 09:45:48.412596 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.006143 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.069737 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-ssh-key\") pod \"52acb043-844e-4401-8453-075ba1a4c174\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") "
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.069781 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25v4t\" (UniqueName: \"kubernetes.io/projected/52acb043-844e-4401-8453-075ba1a4c174-kube-api-access-25v4t\") pod \"52acb043-844e-4401-8453-075ba1a4c174\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") "
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.069888 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-inventory\") pod \"52acb043-844e-4401-8453-075ba1a4c174\" (UID: \"52acb043-844e-4401-8453-075ba1a4c174\") "
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.095648 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52acb043-844e-4401-8453-075ba1a4c174-kube-api-access-25v4t" (OuterVolumeSpecName: "kube-api-access-25v4t") pod "52acb043-844e-4401-8453-075ba1a4c174" (UID: "52acb043-844e-4401-8453-075ba1a4c174"). InnerVolumeSpecName "kube-api-access-25v4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.109549 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-inventory" (OuterVolumeSpecName: "inventory") pod "52acb043-844e-4401-8453-075ba1a4c174" (UID: "52acb043-844e-4401-8453-075ba1a4c174"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.122265 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "52acb043-844e-4401-8453-075ba1a4c174" (UID: "52acb043-844e-4401-8453-075ba1a4c174"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.172327 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.172357 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25v4t\" (UniqueName: \"kubernetes.io/projected/52acb043-844e-4401-8453-075ba1a4c174-kube-api-access-25v4t\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.172368 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/52acb043-844e-4401-8453-075ba1a4c174-inventory\") on node \"crc\" DevicePath \"\""
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.458890 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cblm8" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="registry-server" probeResult="failure" output=<
Dec 01 09:45:49 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s
Dec 01 09:45:49 crc kubenswrapper[4763]: >
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.577615 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc" event={"ID":"52acb043-844e-4401-8453-075ba1a4c174","Type":"ContainerDied","Data":"a9c0edf123b007cda5bdadc8f9c2ab7cf66199b1e25b7b3a65815995d1f7d258"}
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.577899 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9c0edf123b007cda5bdadc8f9c2ab7cf66199b1e25b7b3a65815995d1f7d258"
Dec 01 09:45:49 crc kubenswrapper[4763]: I1201 09:45:49.577819 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"
Dec 01 09:45:50 crc kubenswrapper[4763]: I1201 09:45:50.034789 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g9pwc"]
Dec 01 09:45:50 crc kubenswrapper[4763]: I1201 09:45:50.042830 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dfdsm"]
Dec 01 09:45:50 crc kubenswrapper[4763]: I1201 09:45:50.050511 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-g9pwc"]
Dec 01 09:45:50 crc kubenswrapper[4763]: I1201 09:45:50.059786 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-dfdsm"]
Dec 01 09:45:51 crc kubenswrapper[4763]: I1201 09:45:51.006356 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a198f66c-6f38-4b84-b2bd-898f00d40932" path="/var/lib/kubelet/pods/a198f66c-6f38-4b84-b2bd-898f00d40932/volumes"
Dec 01 09:45:51 crc kubenswrapper[4763]: I1201 09:45:51.007595 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b66291-ad35-4f2b-b21c-da8de9a419d2" path="/var/lib/kubelet/pods/d0b66291-ad35-4f2b-b21c-da8de9a419d2/volumes"
Dec 01 09:45:54 crc kubenswrapper[4763]: I1201 09:45:54.994151 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:45:54 crc kubenswrapper[4763]: E1201 09:45:54.994919 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:45:58 crc kubenswrapper[4763]: I1201 09:45:58.461573 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:58 crc kubenswrapper[4763]: I1201 09:45:58.517317 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:45:58 crc kubenswrapper[4763]: I1201 09:45:58.697240 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cblm8"]
Dec 01 09:45:59 crc kubenswrapper[4763]: I1201 09:45:59.654964 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cblm8" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="registry-server" containerID="cri-o://eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01" gracePeriod=2
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.076644 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.117861 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-catalog-content\") pod \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") "
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.117918 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-utilities\") pod \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") "
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.117956 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tbgf\" (UniqueName: \"kubernetes.io/projected/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-kube-api-access-9tbgf\") pod \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\" (UID: \"f01f3219-528c-4813-aa4b-7ae0f02f8e2a\") "
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.119355 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-utilities" (OuterVolumeSpecName: "utilities") pod "f01f3219-528c-4813-aa4b-7ae0f02f8e2a" (UID: "f01f3219-528c-4813-aa4b-7ae0f02f8e2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.124130 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-kube-api-access-9tbgf" (OuterVolumeSpecName: "kube-api-access-9tbgf") pod "f01f3219-528c-4813-aa4b-7ae0f02f8e2a" (UID: "f01f3219-528c-4813-aa4b-7ae0f02f8e2a"). InnerVolumeSpecName "kube-api-access-9tbgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.219858 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-utilities\") on node \"crc\" DevicePath \"\""
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.219893 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tbgf\" (UniqueName: \"kubernetes.io/projected/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-kube-api-access-9tbgf\") on node \"crc\" DevicePath \"\""
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.240829 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f01f3219-528c-4813-aa4b-7ae0f02f8e2a" (UID: "f01f3219-528c-4813-aa4b-7ae0f02f8e2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.321663 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f01f3219-528c-4813-aa4b-7ae0f02f8e2a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.668712 4763 generic.go:334] "Generic (PLEG): container finished" podID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerID="eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01" exitCode=0
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.668760 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cblm8" event={"ID":"f01f3219-528c-4813-aa4b-7ae0f02f8e2a","Type":"ContainerDied","Data":"eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01"}
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.668797 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cblm8"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.668803 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cblm8" event={"ID":"f01f3219-528c-4813-aa4b-7ae0f02f8e2a","Type":"ContainerDied","Data":"ac453debe0380fd8715a9b90c4a0aa9803cca296efa554252de395c9f287e864"}
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.668827 4763 scope.go:117] "RemoveContainer" containerID="eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.698361 4763 scope.go:117] "RemoveContainer" containerID="1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.718213 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cblm8"]
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.728271 4763 scope.go:117] "RemoveContainer" containerID="5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.735135 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cblm8"]
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.771072 4763 scope.go:117] "RemoveContainer" containerID="eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01"
Dec 01 09:46:00 crc kubenswrapper[4763]: E1201 09:46:00.771700 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01\": container with ID starting with eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01 not found: ID does not exist" containerID="eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.771733 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01"} err="failed to get container status \"eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01\": rpc error: code = NotFound desc = could not find container \"eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01\": container with ID starting with eb0560ebaf849b25b31c09e3635b20c765f3eb1cd8630663518ee4ec1605ce01 not found: ID does not exist"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.771755 4763 scope.go:117] "RemoveContainer" containerID="1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d"
Dec 01 09:46:00 crc kubenswrapper[4763]: E1201 09:46:00.771968 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d\": container with ID starting with 1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d not found: ID does not exist" containerID="1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.771989 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d"} err="failed to get container status \"1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d\": rpc error: code = NotFound desc = could not find container \"1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d\": container with ID starting with 1180249cb1758a89eab0a8cd3937d6985e593803866568c3183c3a804c93f86d not found: ID does not exist"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.772002 4763 scope.go:117] "RemoveContainer" containerID="5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d"
Dec 01 09:46:00 crc kubenswrapper[4763]: E1201 09:46:00.772195 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d\": container with ID starting with 5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d not found: ID does not exist" containerID="5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d"
Dec 01 09:46:00 crc kubenswrapper[4763]: I1201 09:46:00.772209 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d"} err="failed to get container status \"5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d\": rpc error: code = NotFound desc = could not find container \"5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d\": container with ID starting with 5363e08b955e6f513975211d3174bb1a78baa0277859a381e0f6bb6be781c47d not found: ID does not exist"
Dec 01 09:46:01 crc kubenswrapper[4763]: I1201 09:46:01.005519 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" path="/var/lib/kubelet/pods/f01f3219-528c-4813-aa4b-7ae0f02f8e2a/volumes"
Dec 01 09:46:04 crc kubenswrapper[4763]: I1201 09:46:04.090115 4763 scope.go:117] "RemoveContainer" containerID="739b1c44ff01e8daad6a4526a7dab3934bd0772fff7a63a3245992da32155aba"
Dec 01 09:46:04 crc kubenswrapper[4763]: I1201 09:46:04.135392 4763 scope.go:117] "RemoveContainer" containerID="0c6926e444bcc6c38baa09d6d0b3aaf27db832b3d984b682bbe88da39600f620"
Dec 01 09:46:04 crc kubenswrapper[4763]: I1201 09:46:04.188391 4763 scope.go:117] "RemoveContainer" containerID="fb2b85355691165fcc2f4bc4e6af80c4dc40b760c308d24019ebeb2755f66632"
Dec 01 09:46:09 crc kubenswrapper[4763]: I1201 09:46:09.995111 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:46:09 crc kubenswrapper[4763]: E1201 09:46:09.995764 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:46:21 crc kubenswrapper[4763]: I1201 09:46:21.993847 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:46:21 crc kubenswrapper[4763]: E1201 09:46:21.994692 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:46:33 crc kubenswrapper[4763]: I1201 09:46:33.002064 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:46:33 crc kubenswrapper[4763]: E1201 09:46:33.002881 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:46:35 crc kubenswrapper[4763]: I1201 09:46:35.050734 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6nsvb"]
Dec 01 09:46:35 crc kubenswrapper[4763]: I1201 09:46:35.057700 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6nsvb"]
Dec 01 09:46:37 crc kubenswrapper[4763]: I1201 09:46:37.011053 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af83b033-4df7-4f11-bf61-ab1addfeb933" path="/var/lib/kubelet/pods/af83b033-4df7-4f11-bf61-ab1addfeb933/volumes"
Dec 01 09:46:46 crc kubenswrapper[4763]: I1201 09:46:46.994402 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:46:46 crc kubenswrapper[4763]: E1201 09:46:46.995153 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:47:00 crc kubenswrapper[4763]: I1201 09:47:00.994179 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:47:00 crc kubenswrapper[4763]: E1201 09:47:00.995068 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:47:04 crc kubenswrapper[4763]: I1201 09:47:04.290568 4763 scope.go:117] "RemoveContainer" containerID="867ac822b4d99d98ebbfbf598f7e8b3ab7c2cef8ddc81860f77880cbba820179"
Dec 01 09:47:14 crc kubenswrapper[4763]: I1201 09:47:14.994337 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:47:14 crc kubenswrapper[4763]: E1201 09:47:14.994996 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:47:29 crc kubenswrapper[4763]: I1201 09:47:29.994293 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:47:29 crc kubenswrapper[4763]: E1201 09:47:29.995035 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e"
Dec 01 09:47:41 crc kubenswrapper[4763]: I1201 09:47:41.995050 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991"
Dec 01 09:47:42 crc kubenswrapper[4763]: I1201 09:47:42.489011 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"9c17a35b99de2373701725812b0239f382de45e170970d427ea69e40f3be13c9"}
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.212905 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-27btm"]
Dec 01 09:50:01 crc kubenswrapper[4763]: E1201 09:50:01.214891 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="registry-server"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.215006 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="registry-server"
Dec 01 09:50:01 crc kubenswrapper[4763]: E1201 09:50:01.215090 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52acb043-844e-4401-8453-075ba1a4c174" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.215175 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="52acb043-844e-4401-8453-075ba1a4c174" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:50:01 crc kubenswrapper[4763]: E1201 09:50:01.215265 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="extract-content"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.215331 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="extract-content"
Dec 01 09:50:01 crc kubenswrapper[4763]: E1201 09:50:01.215407 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="extract-utilities"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.215496 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="extract-utilities"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.215785 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01f3219-528c-4813-aa4b-7ae0f02f8e2a" containerName="registry-server"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.215882 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="52acb043-844e-4401-8453-075ba1a4c174" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.217669 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.237310 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27btm"]
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.323374 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4hp\" (UniqueName: \"kubernetes.io/projected/63468524-5b53-4451-a3e9-180897432dce-kube-api-access-xs4hp\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.323445 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-catalog-content\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.323525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-utilities\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.424838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4hp\" (UniqueName: \"kubernetes.io/projected/63468524-5b53-4451-a3e9-180897432dce-kube-api-access-xs4hp\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.425143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-catalog-content\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.425231 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-utilities\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.425731 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-catalog-content\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.425813 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-utilities\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.458550 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4hp\" (UniqueName: \"kubernetes.io/projected/63468524-5b53-4451-a3e9-180897432dce-kube-api-access-xs4hp\") pod \"redhat-marketplace-27btm\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:01 crc kubenswrapper[4763]: I1201 09:50:01.537874 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27btm"
Dec 01 09:50:02 crc kubenswrapper[4763]: I1201 09:50:02.081280 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27btm"]
Dec 01 09:50:02 crc kubenswrapper[4763]: I1201 09:50:02.690644 4763 generic.go:334] "Generic (PLEG): container finished" podID="63468524-5b53-4451-a3e9-180897432dce" containerID="b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677" exitCode=0
Dec 01 09:50:02 crc kubenswrapper[4763]: I1201 09:50:02.690685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27btm" event={"ID":"63468524-5b53-4451-a3e9-180897432dce","Type":"ContainerDied","Data":"b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677"}
Dec 01 09:50:02 crc kubenswrapper[4763]: I1201 09:50:02.690710 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27btm" event={"ID":"63468524-5b53-4451-a3e9-180897432dce","Type":"ContainerStarted","Data":"a7320002f091eaee81b9beddb18757b3698a4df07f411f3e419658bb37401fec"}
Dec 01 09:50:02 crc kubenswrapper[4763]: I1201 09:50:02.693246 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 01 09:50:03 crc kubenswrapper[4763]: I1201 09:50:03.701401 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27btm" event={"ID":"63468524-5b53-4451-a3e9-180897432dce","Type":"ContainerStarted","Data":"eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db"}
Dec 01 09:50:03 crc kubenswrapper[4763]: I1201 09:50:03.928997 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 01 09:50:03 crc kubenswrapper[4763]: I1201 09:50:03.929063 4763 prober.go:107] "Probe failed" probeType="Liveness"
pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:50:04 crc kubenswrapper[4763]: I1201 09:50:04.715900 4763 generic.go:334] "Generic (PLEG): container finished" podID="63468524-5b53-4451-a3e9-180897432dce" containerID="eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db" exitCode=0 Dec 01 09:50:04 crc kubenswrapper[4763]: I1201 09:50:04.715953 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27btm" event={"ID":"63468524-5b53-4451-a3e9-180897432dce","Type":"ContainerDied","Data":"eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db"} Dec 01 09:50:05 crc kubenswrapper[4763]: I1201 09:50:05.788245 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27btm" event={"ID":"63468524-5b53-4451-a3e9-180897432dce","Type":"ContainerStarted","Data":"73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676"} Dec 01 09:50:05 crc kubenswrapper[4763]: I1201 09:50:05.819038 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27btm" podStartSLOduration=2.252917003 podStartE2EDuration="4.819021555s" podCreationTimestamp="2025-12-01 09:50:01 +0000 UTC" firstStartedPulling="2025-12-01 09:50:02.692987121 +0000 UTC m=+2119.961635889" lastFinishedPulling="2025-12-01 09:50:05.259091673 +0000 UTC m=+2122.527740441" observedRunningTime="2025-12-01 09:50:05.815730955 +0000 UTC m=+2123.084379723" watchObservedRunningTime="2025-12-01 09:50:05.819021555 +0000 UTC m=+2123.087670323" Dec 01 09:50:11 crc kubenswrapper[4763]: I1201 09:50:11.538934 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27btm" Dec 01 09:50:11 crc kubenswrapper[4763]: I1201 09:50:11.540041 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27btm" Dec 01 09:50:11 crc kubenswrapper[4763]: I1201 09:50:11.603884 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27btm" Dec 01 09:50:11 crc kubenswrapper[4763]: I1201 09:50:11.887428 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27btm" Dec 01 09:50:11 crc kubenswrapper[4763]: I1201 09:50:11.933271 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27btm"] Dec 01 09:50:13 crc kubenswrapper[4763]: I1201 09:50:13.850328 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-27btm" podUID="63468524-5b53-4451-a3e9-180897432dce" containerName="registry-server" containerID="cri-o://73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676" gracePeriod=2 Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.351249 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27btm" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.505609 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-utilities\") pod \"63468524-5b53-4451-a3e9-180897432dce\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.505671 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-catalog-content\") pod \"63468524-5b53-4451-a3e9-180897432dce\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.505695 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs4hp\" (UniqueName: \"kubernetes.io/projected/63468524-5b53-4451-a3e9-180897432dce-kube-api-access-xs4hp\") pod \"63468524-5b53-4451-a3e9-180897432dce\" (UID: \"63468524-5b53-4451-a3e9-180897432dce\") " Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.507175 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-utilities" (OuterVolumeSpecName: "utilities") pod "63468524-5b53-4451-a3e9-180897432dce" (UID: "63468524-5b53-4451-a3e9-180897432dce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.512256 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63468524-5b53-4451-a3e9-180897432dce-kube-api-access-xs4hp" (OuterVolumeSpecName: "kube-api-access-xs4hp") pod "63468524-5b53-4451-a3e9-180897432dce" (UID: "63468524-5b53-4451-a3e9-180897432dce"). InnerVolumeSpecName "kube-api-access-xs4hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.527352 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63468524-5b53-4451-a3e9-180897432dce" (UID: "63468524-5b53-4451-a3e9-180897432dce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.607893 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.608295 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63468524-5b53-4451-a3e9-180897432dce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.608309 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs4hp\" (UniqueName: \"kubernetes.io/projected/63468524-5b53-4451-a3e9-180897432dce-kube-api-access-xs4hp\") on node \"crc\" DevicePath \"\"" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.871133 4763 generic.go:334] "Generic (PLEG): container finished" podID="63468524-5b53-4451-a3e9-180897432dce" containerID="73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676" exitCode=0 Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.872186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27btm" event={"ID":"63468524-5b53-4451-a3e9-180897432dce","Type":"ContainerDied","Data":"73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676"} Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.872298 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27btm" event={"ID":"63468524-5b53-4451-a3e9-180897432dce","Type":"ContainerDied","Data":"a7320002f091eaee81b9beddb18757b3698a4df07f411f3e419658bb37401fec"} Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.872345 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27btm" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.872363 4763 scope.go:117] "RemoveContainer" containerID="73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.900199 4763 scope.go:117] "RemoveContainer" containerID="eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.916298 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27btm"] Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.927730 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27btm"] Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.938125 4763 scope.go:117] "RemoveContainer" containerID="b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.972740 4763 scope.go:117] "RemoveContainer" containerID="73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676" Dec 01 09:50:14 crc kubenswrapper[4763]: E1201 09:50:14.973210 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676\": container with ID starting with 73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676 not found: ID does not exist" containerID="73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.973237 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676"} err="failed to get container status \"73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676\": rpc error: code = NotFound desc = could not find container \"73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676\": container with ID starting with 73c9b2d8c1f1c7d3d887295572dae81b4c711f4e0daad752c36fc893b5ab9676 not found: ID does not exist" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.973256 4763 scope.go:117] "RemoveContainer" containerID="eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db" Dec 01 09:50:14 crc kubenswrapper[4763]: E1201 09:50:14.973664 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db\": container with ID starting with eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db not found: ID does not exist" containerID="eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.973742 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db"} err="failed to get container status \"eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db\": rpc error: code = NotFound desc = could not find container \"eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db\": container with ID starting with eacb5a0b8089ed49d1da143a886251d3c8f6ee4e86274eb2621357adf7d0c9db not found: ID does not exist" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.974158 4763 scope.go:117] "RemoveContainer" 
containerID="b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677" Dec 01 09:50:14 crc kubenswrapper[4763]: E1201 09:50:14.974654 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677\": container with ID starting with b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677 not found: ID does not exist" containerID="b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677" Dec 01 09:50:14 crc kubenswrapper[4763]: I1201 09:50:14.974688 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677"} err="failed to get container status \"b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677\": rpc error: code = NotFound desc = could not find container \"b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677\": container with ID starting with b8a79ca2bcb19132e6f36a5d244766dae75ee7dcd828659efb64f041c4d2f677 not found: ID does not exist" Dec 01 09:50:15 crc kubenswrapper[4763]: I1201 09:50:15.006124 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63468524-5b53-4451-a3e9-180897432dce" path="/var/lib/kubelet/pods/63468524-5b53-4451-a3e9-180897432dce/volumes" Dec 01 09:50:33 crc kubenswrapper[4763]: I1201 09:50:33.930579 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:50:33 crc kubenswrapper[4763]: I1201 09:50:33.931201 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:51:03 crc kubenswrapper[4763]: I1201 09:51:03.929316 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:51:03 crc kubenswrapper[4763]: I1201 09:51:03.929898 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:51:03 crc kubenswrapper[4763]: I1201 09:51:03.929949 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:51:03 crc kubenswrapper[4763]: I1201 09:51:03.930631 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c17a35b99de2373701725812b0239f382de45e170970d427ea69e40f3be13c9"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:51:03 crc 
kubenswrapper[4763]: I1201 09:51:03.930680 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://9c17a35b99de2373701725812b0239f382de45e170970d427ea69e40f3be13c9" gracePeriod=600 Dec 01 09:51:04 crc kubenswrapper[4763]: I1201 09:51:04.328672 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="9c17a35b99de2373701725812b0239f382de45e170970d427ea69e40f3be13c9" exitCode=0 Dec 01 09:51:04 crc kubenswrapper[4763]: I1201 09:51:04.328924 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"9c17a35b99de2373701725812b0239f382de45e170970d427ea69e40f3be13c9"} Dec 01 09:51:04 crc kubenswrapper[4763]: I1201 09:51:04.328949 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7"} Dec 01 09:51:04 crc kubenswrapper[4763]: I1201 09:51:04.328964 4763 scope.go:117] "RemoveContainer" containerID="e5ae749f6cb977aa9ccca0bb9753002fdb474998f80dbcbc064bc52683c39991" Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.467797 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.477537 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.489415 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-849cx"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.496049 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5h9kc"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.502734 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.509088 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.515760 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.521757 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hz9sr"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.528296 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.536792 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d6xbm"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.545545 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-wq4b2"] Dec 01 
09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.552992 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-m5tgx"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.560733 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.568135 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.577984 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.585188 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hz9sr"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.591976 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5fr4d"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.598665 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vr2tt"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.604713 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-r29f9"] Dec 01 09:51:58 crc kubenswrapper[4763]: I1201 09:51:58.611090 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-7tkkc"] Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.006202 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16706493-814b-4cc5-821d-f484a2059376" path="/var/lib/kubelet/pods/16706493-814b-4cc5-821d-f484a2059376/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.007558 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ea7914-fefd-4dd4-a1aa-f0593874e6c2" path="/var/lib/kubelet/pods/25ea7914-fefd-4dd4-a1aa-f0593874e6c2/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.008487 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52acb043-844e-4401-8453-075ba1a4c174" path="/var/lib/kubelet/pods/52acb043-844e-4401-8453-075ba1a4c174/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.009412 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="852eca7e-7a83-49d7-9950-aabc202ec4ec" path="/var/lib/kubelet/pods/852eca7e-7a83-49d7-9950-aabc202ec4ec/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.011116 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2922a2d-b44c-4e37-9edc-d7e51fdea83f" path="/var/lib/kubelet/pods/a2922a2d-b44c-4e37-9edc-d7e51fdea83f/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.011895 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59b383d-c6e5-43da-8066-484e944a3ea8" path="/var/lib/kubelet/pods/a59b383d-c6e5-43da-8066-484e944a3ea8/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.012624 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a0f61c-6313-4799-9c99-47866415c99a" path="/var/lib/kubelet/pods/b0a0f61c-6313-4799-9c99-47866415c99a/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.014076 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb" path="/var/lib/kubelet/pods/cbe5fb5f-fe35-49e4-b769-f6ca0a0483bb/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.015026 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66619ba-16d0-4218-bf3a-652bf97bdcce" path="/var/lib/kubelet/pods/e66619ba-16d0-4218-bf3a-652bf97bdcce/volumes" Dec 01 09:51:59 crc kubenswrapper[4763]: I1201 09:51:59.015902 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f723fec0-c702-47aa-bf56-ea502c4ce783" path="/var/lib/kubelet/pods/f723fec0-c702-47aa-bf56-ea502c4ce783/volumes" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.440898 4763 scope.go:117] "RemoveContainer" containerID="4df950c366625a1709d317852f7fe9b799184b0fc7592c128a1cac2f99788d89" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.514164 4763 scope.go:117] "RemoveContainer" containerID="0170caf3836ae474cbda75d254d35da72aeaffe23ef3589ece01c9157b2dee2d" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.550219 4763 scope.go:117] "RemoveContainer" containerID="91c844f118225e46f64f77db9eec96e265f22dd5d99dd382eec99958c04c31b3" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.591148 4763 scope.go:117] "RemoveContainer" containerID="d89e372174d4a35a7ed139f057cb2cc71d9b1c04dbee1ce33f0c43f20a50d2c0" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.649615 4763 scope.go:117] "RemoveContainer" containerID="4999825601fa720f6c3cf485d0c40831c8c89250b2ad404aee56a5d0a9ae2d6f" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.754814 4763 scope.go:117] "RemoveContainer" containerID="af6d94ecfb66a1908fd633d6ef168f9b2083f29a5fee46489319682d3fcd12e7" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.820397 4763 scope.go:117] "RemoveContainer" containerID="ee920331a3d92b032e0fe611dc65f142fec59e6cdb2dbd316e404b35845892ac" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.860321 4763 scope.go:117] "RemoveContainer" containerID="5bc222005413f0654dd2a17f3cce34a9e960b5c4004cc5e7fb9a3dbea40cbf19" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.916444 4763 scope.go:117] "RemoveContainer" containerID="c5a7cebdabb6c4c6fab4657d54f8f5c667aa6223750edba127017d2341db364c" Dec 01 09:52:04 crc kubenswrapper[4763]: I1201 09:52:04.945230 4763 scope.go:117] "RemoveContainer" containerID="9f3f5d3da66093649aa278b8d4ac87629454876c9e3a2dc521c37ed532b588c6" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.508183 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm"] Dec 01 09:52:12 crc kubenswrapper[4763]: E1201 09:52:12.513156 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63468524-5b53-4451-a3e9-180897432dce" containerName="extract-utilities" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.513180 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="63468524-5b53-4451-a3e9-180897432dce" containerName="extract-utilities" Dec 01 09:52:12 crc kubenswrapper[4763]: E1201 09:52:12.513189 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63468524-5b53-4451-a3e9-180897432dce" containerName="extract-content" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.513196 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="63468524-5b53-4451-a3e9-180897432dce" containerName="extract-content" Dec 01 09:52:12 crc kubenswrapper[4763]: E1201 09:52:12.513210 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="63468524-5b53-4451-a3e9-180897432dce" containerName="registry-server" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.513216 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="63468524-5b53-4451-a3e9-180897432dce" containerName="registry-server" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.513450 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="63468524-5b53-4451-a3e9-180897432dce" containerName="registry-server" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.514153 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.517312 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm"] Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.517330 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.517395 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.517497 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.518862 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.521672 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.585798 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vggxc\" (UniqueName: \"kubernetes.io/projected/09695b88-6f6f-469a-b41a-02cd50e1f216-kube-api-access-vggxc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.585929 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.585967 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.586108 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.586193 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.688243 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.688296 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.688347 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.688399 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.688438 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vggxc\" (UniqueName: \"kubernetes.io/projected/09695b88-6f6f-469a-b41a-02cd50e1f216-kube-api-access-vggxc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.695925 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.696062 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 
09:52:12.696059 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.697648 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.713333 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vggxc\" (UniqueName: \"kubernetes.io/projected/09695b88-6f6f-469a-b41a-02cd50e1f216-kube-api-access-vggxc\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:12 crc kubenswrapper[4763]: I1201 09:52:12.829948 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:13 crc kubenswrapper[4763]: I1201 09:52:13.361441 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm"] Dec 01 09:52:14 crc kubenswrapper[4763]: I1201 09:52:14.275252 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" event={"ID":"09695b88-6f6f-469a-b41a-02cd50e1f216","Type":"ContainerStarted","Data":"05b6ac602ed2f5b0fd9e0dec5ab316fc44b6fb83b3fad6d0d271fec9f01e11a9"} Dec 01 09:52:14 crc kubenswrapper[4763]: I1201 09:52:14.275629 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" event={"ID":"09695b88-6f6f-469a-b41a-02cd50e1f216","Type":"ContainerStarted","Data":"f66a802aa8744cbc74de0cec6b8093de150ab37df0d7df56e11f3e4dd7db21a7"} Dec 01 09:52:14 crc kubenswrapper[4763]: I1201 09:52:14.297095 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" podStartSLOduration=1.730892507 podStartE2EDuration="2.297076285s" podCreationTimestamp="2025-12-01 09:52:12 +0000 UTC" firstStartedPulling="2025-12-01 09:52:13.368031699 +0000 UTC m=+2250.636680467" lastFinishedPulling="2025-12-01 09:52:13.934215477 +0000 UTC m=+2251.202864245" observedRunningTime="2025-12-01 09:52:14.290699235 +0000 UTC m=+2251.559348003" watchObservedRunningTime="2025-12-01 09:52:14.297076285 +0000 UTC m=+2251.565725053" Dec 01 09:52:27 crc kubenswrapper[4763]: I1201 09:52:27.383289 4763 generic.go:334] "Generic (PLEG): container finished" podID="09695b88-6f6f-469a-b41a-02cd50e1f216" containerID="05b6ac602ed2f5b0fd9e0dec5ab316fc44b6fb83b3fad6d0d271fec9f01e11a9" exitCode=0 Dec 01 09:52:27 crc kubenswrapper[4763]: I1201 09:52:27.383389 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" 
event={"ID":"09695b88-6f6f-469a-b41a-02cd50e1f216","Type":"ContainerDied","Data":"05b6ac602ed2f5b0fd9e0dec5ab316fc44b6fb83b3fad6d0d271fec9f01e11a9"} Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.828441 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.971267 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vggxc\" (UniqueName: \"kubernetes.io/projected/09695b88-6f6f-469a-b41a-02cd50e1f216-kube-api-access-vggxc\") pod \"09695b88-6f6f-469a-b41a-02cd50e1f216\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.971348 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-repo-setup-combined-ca-bundle\") pod \"09695b88-6f6f-469a-b41a-02cd50e1f216\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.971428 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ceph\") pod \"09695b88-6f6f-469a-b41a-02cd50e1f216\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.971491 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ssh-key\") pod \"09695b88-6f6f-469a-b41a-02cd50e1f216\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.971550 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-inventory\") pod \"09695b88-6f6f-469a-b41a-02cd50e1f216\" (UID: \"09695b88-6f6f-469a-b41a-02cd50e1f216\") " Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.978740 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "09695b88-6f6f-469a-b41a-02cd50e1f216" (UID: "09695b88-6f6f-469a-b41a-02cd50e1f216"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.981395 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09695b88-6f6f-469a-b41a-02cd50e1f216-kube-api-access-vggxc" (OuterVolumeSpecName: "kube-api-access-vggxc") pod "09695b88-6f6f-469a-b41a-02cd50e1f216" (UID: "09695b88-6f6f-469a-b41a-02cd50e1f216"). InnerVolumeSpecName "kube-api-access-vggxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:52:28 crc kubenswrapper[4763]: I1201 09:52:28.986658 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ceph" (OuterVolumeSpecName: "ceph") pod "09695b88-6f6f-469a-b41a-02cd50e1f216" (UID: "09695b88-6f6f-469a-b41a-02cd50e1f216"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.002926 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "09695b88-6f6f-469a-b41a-02cd50e1f216" (UID: "09695b88-6f6f-469a-b41a-02cd50e1f216"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.011227 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-inventory" (OuterVolumeSpecName: "inventory") pod "09695b88-6f6f-469a-b41a-02cd50e1f216" (UID: "09695b88-6f6f-469a-b41a-02cd50e1f216"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.073631 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.073663 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vggxc\" (UniqueName: \"kubernetes.io/projected/09695b88-6f6f-469a-b41a-02cd50e1f216-kube-api-access-vggxc\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.073677 4763 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.073687 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.073696 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/09695b88-6f6f-469a-b41a-02cd50e1f216-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.413941 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" event={"ID":"09695b88-6f6f-469a-b41a-02cd50e1f216","Type":"ContainerDied","Data":"f66a802aa8744cbc74de0cec6b8093de150ab37df0d7df56e11f3e4dd7db21a7"} Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.415723 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f66a802aa8744cbc74de0cec6b8093de150ab37df0d7df56e11f3e4dd7db21a7" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.415861 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.505410 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj"] Dec 01 09:52:29 crc kubenswrapper[4763]: E1201 09:52:29.505960 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09695b88-6f6f-469a-b41a-02cd50e1f216" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.505986 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="09695b88-6f6f-469a-b41a-02cd50e1f216" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.506223 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="09695b88-6f6f-469a-b41a-02cd50e1f216" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.506935 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.516651 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.516894 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.517044 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.517282 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.526731 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.533095 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj"] Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.587889 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.588231 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcxj7\" (UniqueName: \"kubernetes.io/projected/249234f7-8f79-4a99-a35b-d43677150bf6-kube-api-access-qcxj7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.588415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.588717 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.588863 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.690594 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.690687 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcxj7\" (UniqueName: \"kubernetes.io/projected/249234f7-8f79-4a99-a35b-d43677150bf6-kube-api-access-qcxj7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.690751 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.690815 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.690870 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.695216 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc 
kubenswrapper[4763]: I1201 09:52:29.696109 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.699148 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.707381 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.714434 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcxj7\" (UniqueName: \"kubernetes.io/projected/249234f7-8f79-4a99-a35b-d43677150bf6-kube-api-access-qcxj7\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:29 crc kubenswrapper[4763]: I1201 09:52:29.827381 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:52:30 crc kubenswrapper[4763]: I1201 09:52:30.329728 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj"] Dec 01 09:52:30 crc kubenswrapper[4763]: W1201 09:52:30.335374 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod249234f7_8f79_4a99_a35b_d43677150bf6.slice/crio-7810d587225249e3207d63bdc72e1164fed37b3026352e7deaf419198575f0b5 WatchSource:0}: Error finding container 7810d587225249e3207d63bdc72e1164fed37b3026352e7deaf419198575f0b5: Status 404 returned error can't find the container with id 7810d587225249e3207d63bdc72e1164fed37b3026352e7deaf419198575f0b5 Dec 01 09:52:30 crc kubenswrapper[4763]: I1201 09:52:30.421507 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" event={"ID":"249234f7-8f79-4a99-a35b-d43677150bf6","Type":"ContainerStarted","Data":"7810d587225249e3207d63bdc72e1164fed37b3026352e7deaf419198575f0b5"} Dec 01 09:52:31 crc kubenswrapper[4763]: I1201 09:52:31.431986 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" event={"ID":"249234f7-8f79-4a99-a35b-d43677150bf6","Type":"ContainerStarted","Data":"bb79c2d9d8234e12f38a764a49c5145a1c35e9bce273a2f60f1af04ce916a435"} Dec 01 09:52:31 crc kubenswrapper[4763]: I1201 09:52:31.452357 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" podStartSLOduration=1.968960981 podStartE2EDuration="2.452339924s" podCreationTimestamp="2025-12-01 09:52:29 +0000 UTC" firstStartedPulling="2025-12-01 09:52:30.337718063 +0000 UTC m=+2267.606366831" lastFinishedPulling="2025-12-01 09:52:30.821097006 +0000 UTC m=+2268.089745774" observedRunningTime="2025-12-01 09:52:31.447970247 +0000 UTC m=+2268.716619015" watchObservedRunningTime="2025-12-01 09:52:31.452339924 +0000 UTC m=+2268.720988692" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.461650 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8v6b5"] Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.465688 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.488537 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8v6b5"] Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.490504 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-catalog-content\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.490625 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-utilities\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.490812 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzmm\" (UniqueName: \"kubernetes.io/projected/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-kube-api-access-qtzmm\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.591677 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzmm\" (UniqueName: \"kubernetes.io/projected/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-kube-api-access-qtzmm\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.591815 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-catalog-content\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.591858 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-utilities\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.592264 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-catalog-content\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.592324 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-utilities\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.613378 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qtzmm\" (UniqueName: \"kubernetes.io/projected/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-kube-api-access-qtzmm\") pod \"community-operators-8v6b5\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:35 crc kubenswrapper[4763]: I1201 09:52:35.796067 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:36 crc kubenswrapper[4763]: I1201 09:52:36.541246 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8v6b5"] Dec 01 09:52:37 crc kubenswrapper[4763]: I1201 09:52:37.527138 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerID="add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc" exitCode=0 Dec 01 09:52:37 crc kubenswrapper[4763]: I1201 09:52:37.527530 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6b5" event={"ID":"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8","Type":"ContainerDied","Data":"add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc"} Dec 01 09:52:37 crc kubenswrapper[4763]: I1201 09:52:37.527875 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6b5" event={"ID":"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8","Type":"ContainerStarted","Data":"c7149d7c99b768bfdb93c354624a86619c9377dcbdc4a31552b5b89111f6558f"} Dec 01 09:52:38 crc kubenswrapper[4763]: I1201 09:52:38.537784 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6b5" event={"ID":"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8","Type":"ContainerStarted","Data":"3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9"} Dec 01 09:52:40 crc kubenswrapper[4763]: I1201 09:52:40.556769 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerID="3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9" exitCode=0 Dec 01 09:52:40 crc kubenswrapper[4763]: I1201 09:52:40.556840 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6b5" event={"ID":"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8","Type":"ContainerDied","Data":"3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9"} Dec 01 09:52:41 crc kubenswrapper[4763]: I1201 09:52:41.569548 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6b5" event={"ID":"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8","Type":"ContainerStarted","Data":"f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041"} Dec 01 09:52:41 crc kubenswrapper[4763]: I1201 09:52:41.591978 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8v6b5" podStartSLOduration=3.030976841 podStartE2EDuration="6.591957352s" podCreationTimestamp="2025-12-01 09:52:35 +0000 UTC" firstStartedPulling="2025-12-01 09:52:37.531520299 +0000 UTC m=+2274.800169067" lastFinishedPulling="2025-12-01 09:52:41.09250081 +0000 UTC m=+2278.361149578" observedRunningTime="2025-12-01 09:52:41.590994767 +0000 UTC m=+2278.859643555" watchObservedRunningTime="2025-12-01 09:52:41.591957352 +0000 UTC m=+2278.860606120" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.069149 4763 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-8d7rs"] Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.075893 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.081006 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d7rs"] Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.140955 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cdcn\" (UniqueName: \"kubernetes.io/projected/eb919315-6912-4b58-85b9-5d61df859508-kube-api-access-8cdcn\") pod \"certified-operators-8d7rs\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.141271 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-utilities\") pod \"certified-operators-8d7rs\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.141855 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-catalog-content\") pod \"certified-operators-8d7rs\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.244352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cdcn\" (UniqueName: \"kubernetes.io/projected/eb919315-6912-4b58-85b9-5d61df859508-kube-api-access-8cdcn\") pod \"certified-operators-8d7rs\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.244702 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-utilities\") pod \"certified-operators-8d7rs\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.244823 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-catalog-content\") pod \"certified-operators-8d7rs\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.245186 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-catalog-content\") pod \"certified-operators-8d7rs\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.245435 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-utilities\") pod \"certified-operators-8d7rs\" (UID: 
\"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.265011 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cdcn\" (UniqueName: \"kubernetes.io/projected/eb919315-6912-4b58-85b9-5d61df859508-kube-api-access-8cdcn\") pod \"certified-operators-8d7rs\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.401583 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:43 crc kubenswrapper[4763]: I1201 09:52:43.983995 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8d7rs"] Dec 01 09:52:44 crc kubenswrapper[4763]: I1201 09:52:44.597318 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb919315-6912-4b58-85b9-5d61df859508" containerID="5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8" exitCode=0 Dec 01 09:52:44 crc kubenswrapper[4763]: I1201 09:52:44.597387 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d7rs" event={"ID":"eb919315-6912-4b58-85b9-5d61df859508","Type":"ContainerDied","Data":"5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8"} Dec 01 09:52:44 crc kubenswrapper[4763]: I1201 09:52:44.597649 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d7rs" event={"ID":"eb919315-6912-4b58-85b9-5d61df859508","Type":"ContainerStarted","Data":"123070500e284a291c6b38aa6317b1058b659a641e3aaf100fa327fadf80c543"} Dec 01 09:52:45 crc kubenswrapper[4763]: I1201 09:52:45.612293 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d7rs" event={"ID":"eb919315-6912-4b58-85b9-5d61df859508","Type":"ContainerStarted","Data":"cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c"} Dec 01 09:52:45 crc kubenswrapper[4763]: I1201 09:52:45.796820 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:45 crc kubenswrapper[4763]: I1201 09:52:45.797200 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:45 crc kubenswrapper[4763]: I1201 09:52:45.863935 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:46 crc kubenswrapper[4763]: I1201 09:52:46.663322 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:47 crc kubenswrapper[4763]: I1201 09:52:47.637346 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb919315-6912-4b58-85b9-5d61df859508" containerID="cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c" exitCode=0 Dec 01 09:52:47 crc kubenswrapper[4763]: I1201 09:52:47.637417 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d7rs" event={"ID":"eb919315-6912-4b58-85b9-5d61df859508","Type":"ContainerDied","Data":"cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c"} Dec 01 09:52:48 crc kubenswrapper[4763]: I1201 09:52:48.238207 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8v6b5"] Dec 01 09:52:48 crc kubenswrapper[4763]: I1201 09:52:48.647441 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d7rs" event={"ID":"eb919315-6912-4b58-85b9-5d61df859508","Type":"ContainerStarted","Data":"5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b"} Dec 01 09:52:48 crc kubenswrapper[4763]: I1201 09:52:48.647588 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8v6b5" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerName="registry-server" containerID="cri-o://f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041" gracePeriod=2 Dec 01 09:52:48 crc kubenswrapper[4763]: I1201 09:52:48.673593 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8d7rs" podStartSLOduration=2.198697118 podStartE2EDuration="5.673572426s" podCreationTimestamp="2025-12-01 09:52:43 +0000 UTC" firstStartedPulling="2025-12-01 09:52:44.601148752 +0000 UTC m=+2281.869797520" lastFinishedPulling="2025-12-01 09:52:48.07602406 +0000 UTC m=+2285.344672828" observedRunningTime="2025-12-01 09:52:48.666729324 +0000 UTC m=+2285.935378092" watchObservedRunningTime="2025-12-01 09:52:48.673572426 +0000 UTC m=+2285.942221204" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.150408 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.240700 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtzmm\" (UniqueName: \"kubernetes.io/projected/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-kube-api-access-qtzmm\") pod \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.240798 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-utilities\") pod \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.240844 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-catalog-content\") pod \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\" (UID: \"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8\") " Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.242374 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-utilities" (OuterVolumeSpecName: "utilities") pod "6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" (UID: "6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.260703 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-kube-api-access-qtzmm" (OuterVolumeSpecName: "kube-api-access-qtzmm") pod "6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" (UID: "6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8"). InnerVolumeSpecName "kube-api-access-qtzmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.286614 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" (UID: "6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.342536 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtzmm\" (UniqueName: \"kubernetes.io/projected/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-kube-api-access-qtzmm\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.342756 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.342812 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.660220 4763 generic.go:334] "Generic (PLEG): container finished" podID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerID="f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041" exitCode=0 Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.660266 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6b5" event={"ID":"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8","Type":"ContainerDied","Data":"f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041"} Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.660292 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8v6b5" event={"ID":"6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8","Type":"ContainerDied","Data":"c7149d7c99b768bfdb93c354624a86619c9377dcbdc4a31552b5b89111f6558f"} Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.660308 4763 scope.go:117] "RemoveContainer" containerID="f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.660423 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8v6b5" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.683199 4763 scope.go:117] "RemoveContainer" containerID="3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.704335 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8v6b5"] Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.716018 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8v6b5"] Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.716840 4763 scope.go:117] "RemoveContainer" containerID="add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.762687 4763 scope.go:117] "RemoveContainer" containerID="f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041" Dec 01 09:52:49 crc kubenswrapper[4763]: E1201 09:52:49.763244 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041\": container with ID starting with f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041 not found: ID does not exist" containerID="f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.763293 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041"} err="failed to get container status \"f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041\": rpc error: code = NotFound desc = could not find container \"f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041\": container with ID starting with f03fb0ce12c49efc3c67cb897924bf97bed7815722aeb4868783870ea450f041 not found: ID does not exist" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.763319 4763 scope.go:117] "RemoveContainer" containerID="3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9" Dec 01 09:52:49 crc kubenswrapper[4763]: E1201 09:52:49.763904 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9\": container with ID starting with 3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9 not found: ID does not exist" containerID="3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.763954 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9"} err="failed to get container status \"3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9\": rpc error: code = NotFound desc = could not find container \"3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9\": container with ID starting with 3f48e40b47806de8b5ab22e5e334850becb36173c1284916ec8888e9187b60b9 not found: ID does not exist" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.763986 4763 scope.go:117] "RemoveContainer" containerID="add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc" Dec 01 09:52:49 crc kubenswrapper[4763]: E1201 09:52:49.764384 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc\": container with ID starting with add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc not found: ID does not exist" containerID="add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc" Dec 01 09:52:49 crc kubenswrapper[4763]: I1201 09:52:49.764408 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc"} err="failed to get container status \"add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc\": rpc error: code = NotFound desc = could not find container \"add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc\": container with ID starting with add277b5f2cb99be1caa9d955086719addbe97b57a99e8ca8926778219984edc not found: ID does not exist" Dec 01 09:52:51 crc kubenswrapper[4763]: I1201 09:52:51.006192 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" path="/var/lib/kubelet/pods/6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8/volumes" Dec 01 09:52:53 crc kubenswrapper[4763]: I1201 09:52:53.402595 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:53 crc kubenswrapper[4763]: I1201 09:52:53.402999 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:53 crc kubenswrapper[4763]: I1201 09:52:53.456106 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:53 crc kubenswrapper[4763]: I1201 09:52:53.744086 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:54 crc kubenswrapper[4763]: I1201 09:52:54.237932 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d7rs"] Dec 01 09:52:55 crc kubenswrapper[4763]: I1201 09:52:55.705494 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8d7rs" podUID="eb919315-6912-4b58-85b9-5d61df859508" containerName="registry-server" containerID="cri-o://5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b" gracePeriod=2 Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.150496 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.305850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cdcn\" (UniqueName: \"kubernetes.io/projected/eb919315-6912-4b58-85b9-5d61df859508-kube-api-access-8cdcn\") pod \"eb919315-6912-4b58-85b9-5d61df859508\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.305944 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-utilities\") pod \"eb919315-6912-4b58-85b9-5d61df859508\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.306105 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-catalog-content\") pod \"eb919315-6912-4b58-85b9-5d61df859508\" (UID: \"eb919315-6912-4b58-85b9-5d61df859508\") " Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.307063 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-utilities" (OuterVolumeSpecName: "utilities") pod "eb919315-6912-4b58-85b9-5d61df859508" (UID: "eb919315-6912-4b58-85b9-5d61df859508"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.311700 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb919315-6912-4b58-85b9-5d61df859508-kube-api-access-8cdcn" (OuterVolumeSpecName: "kube-api-access-8cdcn") pod "eb919315-6912-4b58-85b9-5d61df859508" (UID: "eb919315-6912-4b58-85b9-5d61df859508"). InnerVolumeSpecName "kube-api-access-8cdcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.357800 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb919315-6912-4b58-85b9-5d61df859508" (UID: "eb919315-6912-4b58-85b9-5d61df859508"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.408498 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.408555 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cdcn\" (UniqueName: \"kubernetes.io/projected/eb919315-6912-4b58-85b9-5d61df859508-kube-api-access-8cdcn\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.408569 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb919315-6912-4b58-85b9-5d61df859508-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.714523 4763 generic.go:334] "Generic (PLEG): container finished" podID="eb919315-6912-4b58-85b9-5d61df859508" containerID="5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b" exitCode=0 Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.714830 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d7rs" event={"ID":"eb919315-6912-4b58-85b9-5d61df859508","Type":"ContainerDied","Data":"5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b"} Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.714858 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8d7rs" event={"ID":"eb919315-6912-4b58-85b9-5d61df859508","Type":"ContainerDied","Data":"123070500e284a291c6b38aa6317b1058b659a641e3aaf100fa327fadf80c543"} Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.714875 4763 scope.go:117] "RemoveContainer" containerID="5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.715000 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8d7rs" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.743914 4763 scope.go:117] "RemoveContainer" containerID="cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.750901 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8d7rs"] Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.758245 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8d7rs"] Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.771802 4763 scope.go:117] "RemoveContainer" containerID="5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.803753 4763 scope.go:117] "RemoveContainer" containerID="5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b" Dec 01 09:52:56 crc kubenswrapper[4763]: E1201 09:52:56.804124 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b\": container with ID starting with 5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b not found: ID does not exist" containerID="5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.804151 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b"} err="failed to get container status \"5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b\": rpc error: code = NotFound desc = could not find container \"5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b\": container with ID starting with 5b5781f6dc96ce2fecb85b51ec8b175ed2d6ab55d4936d896254472660faa22b not found: ID does not exist" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.804177 4763 scope.go:117] "RemoveContainer" containerID="cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c" Dec 01 09:52:56 crc kubenswrapper[4763]: E1201 09:52:56.804498 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c\": container with ID starting with cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c not found: ID does not exist" containerID="cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.804524 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c"} err="failed to get container status \"cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c\": rpc error: code = NotFound desc = could not find container \"cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c\": container with ID starting with cdf7ba6abed833f35e47075d5f8d3dea81764f7e4a9d3870bb14af7cfc6a842c not found: ID does not exist" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.804541 4763 scope.go:117] "RemoveContainer" containerID="5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8" Dec 01 09:52:56 crc kubenswrapper[4763]: E1201 09:52:56.804877 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8\": container with ID starting with 5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8 not found: ID does not exist" containerID="5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8" Dec 01 09:52:56 crc kubenswrapper[4763]: I1201 09:52:56.804917 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8"} err="failed to get container status \"5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8\": rpc error: code = NotFound desc = could not find container \"5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8\": container with ID starting with 5f02900c19b7171ee505c77e781d04888a53617242a63feeee07b5a46e04ccc8 not found: ID does not exist" Dec 01 09:52:57 crc kubenswrapper[4763]: I1201 09:52:57.004395 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb919315-6912-4b58-85b9-5d61df859508" path="/var/lib/kubelet/pods/eb919315-6912-4b58-85b9-5d61df859508/volumes" Dec 01 09:53:33 crc kubenswrapper[4763]: I1201 09:53:33.929563 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:53:33 crc kubenswrapper[4763]: I1201 09:53:33.930281 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:54:03 crc kubenswrapper[4763]: I1201 09:54:03.930207 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:54:03 crc kubenswrapper[4763]: I1201 09:54:03.931241 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:54:23 crc kubenswrapper[4763]: I1201 09:54:23.419557 4763 generic.go:334] "Generic (PLEG): container finished" podID="249234f7-8f79-4a99-a35b-d43677150bf6" containerID="bb79c2d9d8234e12f38a764a49c5145a1c35e9bce273a2f60f1af04ce916a435" exitCode=0 Dec 01 09:54:23 crc kubenswrapper[4763]: I1201 09:54:23.419647 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" event={"ID":"249234f7-8f79-4a99-a35b-d43677150bf6","Type":"ContainerDied","Data":"bb79c2d9d8234e12f38a764a49c5145a1c35e9bce273a2f60f1af04ce916a435"} Dec 01 09:54:24 crc kubenswrapper[4763]: I1201 09:54:24.833829 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.004082 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ssh-key\") pod \"249234f7-8f79-4a99-a35b-d43677150bf6\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.004233 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory\") pod \"249234f7-8f79-4a99-a35b-d43677150bf6\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.004282 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-bootstrap-combined-ca-bundle\") pod \"249234f7-8f79-4a99-a35b-d43677150bf6\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.004540 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ceph\") pod \"249234f7-8f79-4a99-a35b-d43677150bf6\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.005225 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcxj7\" (UniqueName: \"kubernetes.io/projected/249234f7-8f79-4a99-a35b-d43677150bf6-kube-api-access-qcxj7\") pod \"249234f7-8f79-4a99-a35b-d43677150bf6\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.010176 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ceph" (OuterVolumeSpecName: "ceph") pod "249234f7-8f79-4a99-a35b-d43677150bf6" (UID: "249234f7-8f79-4a99-a35b-d43677150bf6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.025757 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249234f7-8f79-4a99-a35b-d43677150bf6-kube-api-access-qcxj7" (OuterVolumeSpecName: "kube-api-access-qcxj7") pod "249234f7-8f79-4a99-a35b-d43677150bf6" (UID: "249234f7-8f79-4a99-a35b-d43677150bf6"). InnerVolumeSpecName "kube-api-access-qcxj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.026600 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "249234f7-8f79-4a99-a35b-d43677150bf6" (UID: "249234f7-8f79-4a99-a35b-d43677150bf6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:54:25 crc kubenswrapper[4763]: E1201 09:54:25.030613 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory podName:249234f7-8f79-4a99-a35b-d43677150bf6 nodeName:}" failed. 
No retries permitted until 2025-12-01 09:54:25.53058622 +0000 UTC m=+2382.799234998 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory") pod "249234f7-8f79-4a99-a35b-d43677150bf6" (UID: "249234f7-8f79-4a99-a35b-d43677150bf6") : error deleting /var/lib/kubelet/pods/249234f7-8f79-4a99-a35b-d43677150bf6/volume-subpaths: remove /var/lib/kubelet/pods/249234f7-8f79-4a99-a35b-d43677150bf6/volume-subpaths: no such file or directory Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.034517 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "249234f7-8f79-4a99-a35b-d43677150bf6" (UID: "249234f7-8f79-4a99-a35b-d43677150bf6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.108072 4763 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.108120 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.108141 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcxj7\" (UniqueName: \"kubernetes.io/projected/249234f7-8f79-4a99-a35b-d43677150bf6-kube-api-access-qcxj7\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.108156 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.434678 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" event={"ID":"249234f7-8f79-4a99-a35b-d43677150bf6","Type":"ContainerDied","Data":"7810d587225249e3207d63bdc72e1164fed37b3026352e7deaf419198575f0b5"} Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.434902 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7810d587225249e3207d63bdc72e1164fed37b3026352e7deaf419198575f0b5" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.434711 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.556507 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7"] Dec 01 09:54:25 crc kubenswrapper[4763]: E1201 09:54:25.556839 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb919315-6912-4b58-85b9-5d61df859508" containerName="registry-server" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.556854 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb919315-6912-4b58-85b9-5d61df859508" containerName="registry-server" Dec 01 09:54:25 crc kubenswrapper[4763]: E1201 09:54:25.556867 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerName="registry-server" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.556875 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerName="registry-server" Dec 01 09:54:25 crc kubenswrapper[4763]: E1201 09:54:25.556891 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb919315-6912-4b58-85b9-5d61df859508" containerName="extract-content" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.556898 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb919315-6912-4b58-85b9-5d61df859508" containerName="extract-content" Dec 01 09:54:25 crc kubenswrapper[4763]: E1201 09:54:25.556908 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerName="extract-content" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.556914 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerName="extract-content" Dec 01 09:54:25 crc kubenswrapper[4763]: E1201 09:54:25.556923 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249234f7-8f79-4a99-a35b-d43677150bf6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.556930 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="249234f7-8f79-4a99-a35b-d43677150bf6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:54:25 crc kubenswrapper[4763]: E1201 09:54:25.556947 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb919315-6912-4b58-85b9-5d61df859508" containerName="extract-utilities" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.556953 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb919315-6912-4b58-85b9-5d61df859508" containerName="extract-utilities" Dec 01 09:54:25 crc kubenswrapper[4763]: E1201 09:54:25.556965 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerName="extract-utilities" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.556970 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerName="extract-utilities" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.557158 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb919315-6912-4b58-85b9-5d61df859508" containerName="registry-server" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.557174 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="249234f7-8f79-4a99-a35b-d43677150bf6" 
containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.557193 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef6d913-50ee-4ac8-abc6-6bd5a5e13da8" containerName="registry-server" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.557796 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.578399 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7"] Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.614276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory\") pod \"249234f7-8f79-4a99-a35b-d43677150bf6\" (UID: \"249234f7-8f79-4a99-a35b-d43677150bf6\") " Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.615119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vht5g\" (UniqueName: \"kubernetes.io/projected/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-kube-api-access-vht5g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.615173 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.615197 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.615243 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.624657 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory" (OuterVolumeSpecName: "inventory") pod "249234f7-8f79-4a99-a35b-d43677150bf6" (UID: "249234f7-8f79-4a99-a35b-d43677150bf6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.716716 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.716760 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.716806 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.716913 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vht5g\" (UniqueName: \"kubernetes.io/projected/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-kube-api-access-vht5g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.716976 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/249234f7-8f79-4a99-a35b-d43677150bf6-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.721231 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.721371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.721394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.733884 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vht5g\" (UniqueName: 
\"kubernetes.io/projected/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-kube-api-access-vht5g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:25 crc kubenswrapper[4763]: I1201 09:54:25.871796 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:26 crc kubenswrapper[4763]: I1201 09:54:26.376240 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7"] Dec 01 09:54:26 crc kubenswrapper[4763]: I1201 09:54:26.442110 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" event={"ID":"9076b862-2a04-47bc-a6f6-bb99cd48ec2b","Type":"ContainerStarted","Data":"247dfc76956c742a0bdc8873e784e3ae7720f632df4f65d15fdc5386a7c65f61"} Dec 01 09:54:27 crc kubenswrapper[4763]: I1201 09:54:27.453488 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" event={"ID":"9076b862-2a04-47bc-a6f6-bb99cd48ec2b","Type":"ContainerStarted","Data":"9df24f844bb1be753211a2730b43131e1126d0db57dbfc792640c6e0908aabfa"} Dec 01 09:54:33 crc kubenswrapper[4763]: I1201 09:54:33.929777 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 09:54:33 crc kubenswrapper[4763]: I1201 09:54:33.930410 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 09:54:33 crc kubenswrapper[4763]: I1201 09:54:33.930488 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 09:54:33 crc kubenswrapper[4763]: I1201 09:54:33.931286 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 09:54:33 crc kubenswrapper[4763]: I1201 09:54:33.931347 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" gracePeriod=600 Dec 01 09:54:34 crc kubenswrapper[4763]: E1201 09:54:34.069879 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:54:34 crc kubenswrapper[4763]: I1201 09:54:34.508218 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" exitCode=0 Dec 01 09:54:34 crc kubenswrapper[4763]: I1201 09:54:34.508724 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7"} Dec 01 09:54:34 crc kubenswrapper[4763]: I1201 09:54:34.508784 4763 scope.go:117] "RemoveContainer" containerID="9c17a35b99de2373701725812b0239f382de45e170970d427ea69e40f3be13c9" Dec 01 09:54:34 crc kubenswrapper[4763]: I1201 09:54:34.509603 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:54:34 crc kubenswrapper[4763]: E1201 09:54:34.509966 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:54:34 crc kubenswrapper[4763]: I1201 09:54:34.531838 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" podStartSLOduration=8.98393742 podStartE2EDuration="9.531820067s" podCreationTimestamp="2025-12-01 09:54:25 +0000 UTC" firstStartedPulling="2025-12-01 09:54:26.386516569 +0000 UTC m=+2383.655165337" lastFinishedPulling="2025-12-01 09:54:26.934399216 +0000 UTC m=+2384.203047984" observedRunningTime="2025-12-01 09:54:27.47481159 +0000 UTC m=+2384.743460358" watchObservedRunningTime="2025-12-01 09:54:34.531820067 +0000 UTC m=+2391.800468835" Dec 01 09:54:48 crc kubenswrapper[4763]: I1201 09:54:48.994397 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:54:48 crc kubenswrapper[4763]: E1201 09:54:48.995938 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:54:54 crc kubenswrapper[4763]: I1201 09:54:54.674631 4763 generic.go:334] "Generic (PLEG): container finished" podID="9076b862-2a04-47bc-a6f6-bb99cd48ec2b" containerID="9df24f844bb1be753211a2730b43131e1126d0db57dbfc792640c6e0908aabfa" exitCode=0 Dec 01 09:54:54 crc kubenswrapper[4763]: I1201 09:54:54.674719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" event={"ID":"9076b862-2a04-47bc-a6f6-bb99cd48ec2b","Type":"ContainerDied","Data":"9df24f844bb1be753211a2730b43131e1126d0db57dbfc792640c6e0908aabfa"} Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.038109 4763 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.164468 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ceph\") pod \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.164660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ssh-key\") pod \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.164727 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vht5g\" (UniqueName: \"kubernetes.io/projected/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-kube-api-access-vht5g\") pod \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.164766 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-inventory\") pod \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\" (UID: \"9076b862-2a04-47bc-a6f6-bb99cd48ec2b\") " Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.170847 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ceph" (OuterVolumeSpecName: "ceph") pod "9076b862-2a04-47bc-a6f6-bb99cd48ec2b" (UID: "9076b862-2a04-47bc-a6f6-bb99cd48ec2b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.174040 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-kube-api-access-vht5g" (OuterVolumeSpecName: "kube-api-access-vht5g") pod "9076b862-2a04-47bc-a6f6-bb99cd48ec2b" (UID: "9076b862-2a04-47bc-a6f6-bb99cd48ec2b"). InnerVolumeSpecName "kube-api-access-vht5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.195632 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-inventory" (OuterVolumeSpecName: "inventory") pod "9076b862-2a04-47bc-a6f6-bb99cd48ec2b" (UID: "9076b862-2a04-47bc-a6f6-bb99cd48ec2b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.198392 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9076b862-2a04-47bc-a6f6-bb99cd48ec2b" (UID: "9076b862-2a04-47bc-a6f6-bb99cd48ec2b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.267149 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.267195 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vht5g\" (UniqueName: \"kubernetes.io/projected/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-kube-api-access-vht5g\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.267210 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.267222 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9076b862-2a04-47bc-a6f6-bb99cd48ec2b-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.691664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" event={"ID":"9076b862-2a04-47bc-a6f6-bb99cd48ec2b","Type":"ContainerDied","Data":"247dfc76956c742a0bdc8873e784e3ae7720f632df4f65d15fdc5386a7c65f61"} Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.691697 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.691734 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="247dfc76956c742a0bdc8873e784e3ae7720f632df4f65d15fdc5386a7c65f61" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.790965 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv"] Dec 01 09:54:56 crc kubenswrapper[4763]: E1201 09:54:56.792322 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9076b862-2a04-47bc-a6f6-bb99cd48ec2b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.792537 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9076b862-2a04-47bc-a6f6-bb99cd48ec2b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.793084 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9076b862-2a04-47bc-a6f6-bb99cd48ec2b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.794367 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.801691 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.801788 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.802011 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.802121 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.803526 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.828190 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv"] Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.979474 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.979615 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.979669 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:56 crc kubenswrapper[4763]: I1201 09:54:56.979731 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f55nv\" (UniqueName: \"kubernetes.io/projected/0fee9e86-e1af-4201-9817-bf22f5910477-kube-api-access-f55nv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.081899 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.081986 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.082080 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f55nv\" (UniqueName: \"kubernetes.io/projected/0fee9e86-e1af-4201-9817-bf22f5910477-kube-api-access-f55nv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.082154 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.086972 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.087572 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.095210 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.110887 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f55nv\" (UniqueName: \"kubernetes.io/projected/0fee9e86-e1af-4201-9817-bf22f5910477-kube-api-access-f55nv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.155661 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.683566 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv"] Dec 01 09:54:57 crc kubenswrapper[4763]: I1201 09:54:57.709219 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" event={"ID":"0fee9e86-e1af-4201-9817-bf22f5910477","Type":"ContainerStarted","Data":"f6d2f0726b1171531979f6bb73a4af1a1374ae7f44092816d5bfe1bee34fcd9c"} Dec 01 09:54:58 crc kubenswrapper[4763]: I1201 09:54:58.720099 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" event={"ID":"0fee9e86-e1af-4201-9817-bf22f5910477","Type":"ContainerStarted","Data":"7ae16690ce2fe8072844cb037c2545db0e5fabc0e353a90c41d274a444da7edf"} Dec 01 09:54:58 crc kubenswrapper[4763]: I1201 09:54:58.739762 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" podStartSLOduration=2.261502627 podStartE2EDuration="2.739740762s" podCreationTimestamp="2025-12-01 09:54:56 +0000 UTC" firstStartedPulling="2025-12-01 09:54:57.69784892 +0000 UTC m=+2414.966497688" lastFinishedPulling="2025-12-01 09:54:58.176087055 +0000 UTC m=+2415.444735823" observedRunningTime="2025-12-01 09:54:58.738024075 +0000 UTC m=+2416.006672843" watchObservedRunningTime="2025-12-01 09:54:58.739740762 +0000 UTC m=+2416.008389530" Dec 01 09:55:01 crc kubenswrapper[4763]: I1201 09:55:01.994268 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:55:01 crc kubenswrapper[4763]: E1201 09:55:01.994967 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:55:04 crc kubenswrapper[4763]: I1201 09:55:04.768110 4763 generic.go:334] "Generic (PLEG): container finished" podID="0fee9e86-e1af-4201-9817-bf22f5910477" containerID="7ae16690ce2fe8072844cb037c2545db0e5fabc0e353a90c41d274a444da7edf" exitCode=0 Dec 01 09:55:04 crc kubenswrapper[4763]: I1201 09:55:04.768159 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" event={"ID":"0fee9e86-e1af-4201-9817-bf22f5910477","Type":"ContainerDied","Data":"7ae16690ce2fe8072844cb037c2545db0e5fabc0e353a90c41d274a444da7edf"} Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.170068 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.263495 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ceph\") pod \"0fee9e86-e1af-4201-9817-bf22f5910477\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.263552 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ssh-key\") pod \"0fee9e86-e1af-4201-9817-bf22f5910477\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.263618 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-inventory\") pod \"0fee9e86-e1af-4201-9817-bf22f5910477\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.263752 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f55nv\" (UniqueName: \"kubernetes.io/projected/0fee9e86-e1af-4201-9817-bf22f5910477-kube-api-access-f55nv\") pod \"0fee9e86-e1af-4201-9817-bf22f5910477\" (UID: \"0fee9e86-e1af-4201-9817-bf22f5910477\") " Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.269895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fee9e86-e1af-4201-9817-bf22f5910477-kube-api-access-f55nv" (OuterVolumeSpecName: "kube-api-access-f55nv") pod "0fee9e86-e1af-4201-9817-bf22f5910477" (UID: "0fee9e86-e1af-4201-9817-bf22f5910477"). InnerVolumeSpecName "kube-api-access-f55nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.270636 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ceph" (OuterVolumeSpecName: "ceph") pod "0fee9e86-e1af-4201-9817-bf22f5910477" (UID: "0fee9e86-e1af-4201-9817-bf22f5910477"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.311120 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-inventory" (OuterVolumeSpecName: "inventory") pod "0fee9e86-e1af-4201-9817-bf22f5910477" (UID: "0fee9e86-e1af-4201-9817-bf22f5910477"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.314176 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0fee9e86-e1af-4201-9817-bf22f5910477" (UID: "0fee9e86-e1af-4201-9817-bf22f5910477"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.365802 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f55nv\" (UniqueName: \"kubernetes.io/projected/0fee9e86-e1af-4201-9817-bf22f5910477-kube-api-access-f55nv\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.365838 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.365849 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.365860 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0fee9e86-e1af-4201-9817-bf22f5910477-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.783816 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" event={"ID":"0fee9e86-e1af-4201-9817-bf22f5910477","Type":"ContainerDied","Data":"f6d2f0726b1171531979f6bb73a4af1a1374ae7f44092816d5bfe1bee34fcd9c"} Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.784387 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6d2f0726b1171531979f6bb73a4af1a1374ae7f44092816d5bfe1bee34fcd9c" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.783878 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.902215 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf"] Dec 01 09:55:06 crc kubenswrapper[4763]: E1201 09:55:06.902686 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fee9e86-e1af-4201-9817-bf22f5910477" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.902714 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fee9e86-e1af-4201-9817-bf22f5910477" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.902952 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fee9e86-e1af-4201-9817-bf22f5910477" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.903719 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.905631 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.905790 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.905851 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.906524 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.907571 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:55:06 crc kubenswrapper[4763]: I1201 09:55:06.918291 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf"] Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.075246 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79zt\" (UniqueName: \"kubernetes.io/projected/feb1b8da-b40b-439e-a27e-3f78045bbf86-kube-api-access-q79zt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.075302 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.075436 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.075526 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.177234 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q79zt\" (UniqueName: \"kubernetes.io/projected/feb1b8da-b40b-439e-a27e-3f78045bbf86-kube-api-access-q79zt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.177380 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.177468 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.177567 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.183218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.183300 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.195080 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.199163 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79zt\" (UniqueName: \"kubernetes.io/projected/feb1b8da-b40b-439e-a27e-3f78045bbf86-kube-api-access-q79zt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-c97lf\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.286780 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.853612 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf"] Dec 01 09:55:07 crc kubenswrapper[4763]: I1201 09:55:07.867112 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 09:55:09 crc kubenswrapper[4763]: I1201 09:55:09.096656 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" event={"ID":"feb1b8da-b40b-439e-a27e-3f78045bbf86","Type":"ContainerStarted","Data":"7d5e06c859830d3d70238c23f4a1ad37e3ce73aa89408bd57851b46360859fe8"} Dec 01 09:55:10 crc kubenswrapper[4763]: I1201 09:55:10.101269 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" event={"ID":"feb1b8da-b40b-439e-a27e-3f78045bbf86","Type":"ContainerStarted","Data":"644475988e7d9ae3a4ce3c3d432dc1f8d7161d4429fa94149e711f1bf9c1b0a1"} Dec 01 09:55:10 crc kubenswrapper[4763]: I1201 09:55:10.126170 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" podStartSLOduration=2.508143133 podStartE2EDuration="4.126150828s" podCreationTimestamp="2025-12-01 09:55:06 +0000 UTC" firstStartedPulling="2025-12-01 09:55:07.86685541 +0000 UTC m=+2425.135504188" lastFinishedPulling="2025-12-01 09:55:09.484863115 +0000 UTC m=+2426.753511883" observedRunningTime="2025-12-01 09:55:10.120165816 +0000 UTC m=+2427.388814584" watchObservedRunningTime="2025-12-01 09:55:10.126150828 +0000 UTC m=+2427.394799596" Dec 01 09:55:15 crc kubenswrapper[4763]: I1201 09:55:15.994608 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:55:15 crc kubenswrapper[4763]: E1201 09:55:15.995448 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:55:29 crc kubenswrapper[4763]: I1201 09:55:29.994582 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:55:29 crc kubenswrapper[4763]: E1201 09:55:29.995317 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:55:42 crc kubenswrapper[4763]: I1201 09:55:42.999150 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:55:43 crc kubenswrapper[4763]: E1201 09:55:42.999911 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:55:52 crc kubenswrapper[4763]: I1201 09:55:52.469492 4763 generic.go:334] "Generic (PLEG): container finished" podID="feb1b8da-b40b-439e-a27e-3f78045bbf86" containerID="644475988e7d9ae3a4ce3c3d432dc1f8d7161d4429fa94149e711f1bf9c1b0a1" exitCode=0 Dec 01 09:55:52 crc kubenswrapper[4763]: I1201 09:55:52.469541 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" event={"ID":"feb1b8da-b40b-439e-a27e-3f78045bbf86","Type":"ContainerDied","Data":"644475988e7d9ae3a4ce3c3d432dc1f8d7161d4429fa94149e711f1bf9c1b0a1"} Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.871937 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.924900 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ceph\") pod \"feb1b8da-b40b-439e-a27e-3f78045bbf86\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.924988 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q79zt\" (UniqueName: \"kubernetes.io/projected/feb1b8da-b40b-439e-a27e-3f78045bbf86-kube-api-access-q79zt\") pod \"feb1b8da-b40b-439e-a27e-3f78045bbf86\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.925034 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ssh-key\") pod \"feb1b8da-b40b-439e-a27e-3f78045bbf86\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.925085 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-inventory\") pod \"feb1b8da-b40b-439e-a27e-3f78045bbf86\" (UID: \"feb1b8da-b40b-439e-a27e-3f78045bbf86\") " Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.931899 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb1b8da-b40b-439e-a27e-3f78045bbf86-kube-api-access-q79zt" (OuterVolumeSpecName: "kube-api-access-q79zt") pod "feb1b8da-b40b-439e-a27e-3f78045bbf86" (UID: "feb1b8da-b40b-439e-a27e-3f78045bbf86"). InnerVolumeSpecName "kube-api-access-q79zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.932184 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ceph" (OuterVolumeSpecName: "ceph") pod "feb1b8da-b40b-439e-a27e-3f78045bbf86" (UID: "feb1b8da-b40b-439e-a27e-3f78045bbf86"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.958713 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "feb1b8da-b40b-439e-a27e-3f78045bbf86" (UID: "feb1b8da-b40b-439e-a27e-3f78045bbf86"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:55:53 crc kubenswrapper[4763]: I1201 09:55:53.965639 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-inventory" (OuterVolumeSpecName: "inventory") pod "feb1b8da-b40b-439e-a27e-3f78045bbf86" (UID: "feb1b8da-b40b-439e-a27e-3f78045bbf86"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.027203 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q79zt\" (UniqueName: \"kubernetes.io/projected/feb1b8da-b40b-439e-a27e-3f78045bbf86-kube-api-access-q79zt\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.027945 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.028016 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.028085 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feb1b8da-b40b-439e-a27e-3f78045bbf86-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.488681 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" event={"ID":"feb1b8da-b40b-439e-a27e-3f78045bbf86","Type":"ContainerDied","Data":"7d5e06c859830d3d70238c23f4a1ad37e3ce73aa89408bd57851b46360859fe8"} Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.489195 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d5e06c859830d3d70238c23f4a1ad37e3ce73aa89408bd57851b46360859fe8" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.488728 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-c97lf" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.587282 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp"] Dec 01 09:55:54 crc kubenswrapper[4763]: E1201 09:55:54.587635 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb1b8da-b40b-439e-a27e-3f78045bbf86" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.587653 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb1b8da-b40b-439e-a27e-3f78045bbf86" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.587828 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb1b8da-b40b-439e-a27e-3f78045bbf86" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.588428 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.591108 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.591226 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.591161 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.591603 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.592909 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.602616 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp"] Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.637547 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp28d\" (UniqueName: \"kubernetes.io/projected/93089553-1487-4d3b-ab46-1cd7822aa6ad-kube-api-access-rp28d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.637710 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.637756 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.637794 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.739238 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.739316 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.739346 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.739485 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp28d\" (UniqueName: \"kubernetes.io/projected/93089553-1487-4d3b-ab46-1cd7822aa6ad-kube-api-access-rp28d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.744415 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.744641 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.744894 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.771309 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp28d\" (UniqueName: \"kubernetes.io/projected/93089553-1487-4d3b-ab46-1cd7822aa6ad-kube-api-access-rp28d\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.924909 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:55:54 crc kubenswrapper[4763]: I1201 09:55:54.994551 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:55:54 crc kubenswrapper[4763]: E1201 09:55:54.994814 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:55:55 crc kubenswrapper[4763]: I1201 09:55:55.446699 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp"] Dec 01 09:55:55 crc kubenswrapper[4763]: I1201 09:55:55.498065 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" event={"ID":"93089553-1487-4d3b-ab46-1cd7822aa6ad","Type":"ContainerStarted","Data":"3399c729a9c8f7d6f054890198daebee830ed12014908dedc272b9b9f1ee3dac"} Dec 01 09:55:56 crc kubenswrapper[4763]: I1201 09:55:56.512404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" event={"ID":"93089553-1487-4d3b-ab46-1cd7822aa6ad","Type":"ContainerStarted","Data":"dc8ebdd5ce8a1a8452707f9d4d665e964cd34ac733d1356bc6603335ce827e39"} Dec 01 09:55:56 crc kubenswrapper[4763]: I1201 09:55:56.536620 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" podStartSLOduration=1.818278903 podStartE2EDuration="2.536606333s" podCreationTimestamp="2025-12-01 09:55:54 +0000 UTC" firstStartedPulling="2025-12-01 09:55:55.454050309 +0000 UTC m=+2472.722699077" lastFinishedPulling="2025-12-01 09:55:56.172377739 +0000 UTC m=+2473.441026507" observedRunningTime="2025-12-01 09:55:56.533492628 +0000 UTC m=+2473.802141396" watchObservedRunningTime="2025-12-01 09:55:56.536606333 +0000 UTC m=+2473.805255101" Dec 01 09:56:00 crc kubenswrapper[4763]: I1201 09:56:00.549128 4763 generic.go:334] "Generic (PLEG): container finished" podID="93089553-1487-4d3b-ab46-1cd7822aa6ad" containerID="dc8ebdd5ce8a1a8452707f9d4d665e964cd34ac733d1356bc6603335ce827e39" exitCode=0 Dec 01 09:56:00 crc kubenswrapper[4763]: I1201 09:56:00.549238 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" event={"ID":"93089553-1487-4d3b-ab46-1cd7822aa6ad","Type":"ContainerDied","Data":"dc8ebdd5ce8a1a8452707f9d4d665e964cd34ac733d1356bc6603335ce827e39"} Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.064005 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.213344 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ceph\") pod \"93089553-1487-4d3b-ab46-1cd7822aa6ad\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.213774 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ssh-key\") pod \"93089553-1487-4d3b-ab46-1cd7822aa6ad\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.214926 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-inventory\") pod \"93089553-1487-4d3b-ab46-1cd7822aa6ad\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.215363 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp28d\" (UniqueName: \"kubernetes.io/projected/93089553-1487-4d3b-ab46-1cd7822aa6ad-kube-api-access-rp28d\") pod \"93089553-1487-4d3b-ab46-1cd7822aa6ad\" (UID: \"93089553-1487-4d3b-ab46-1cd7822aa6ad\") " Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.232549 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93089553-1487-4d3b-ab46-1cd7822aa6ad-kube-api-access-rp28d" (OuterVolumeSpecName: "kube-api-access-rp28d") pod "93089553-1487-4d3b-ab46-1cd7822aa6ad" (UID: "93089553-1487-4d3b-ab46-1cd7822aa6ad"). InnerVolumeSpecName "kube-api-access-rp28d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.240566 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ceph" (OuterVolumeSpecName: "ceph") pod "93089553-1487-4d3b-ab46-1cd7822aa6ad" (UID: "93089553-1487-4d3b-ab46-1cd7822aa6ad"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.246829 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "93089553-1487-4d3b-ab46-1cd7822aa6ad" (UID: "93089553-1487-4d3b-ab46-1cd7822aa6ad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.249009 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-inventory" (OuterVolumeSpecName: "inventory") pod "93089553-1487-4d3b-ab46-1cd7822aa6ad" (UID: "93089553-1487-4d3b-ab46-1cd7822aa6ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.318034 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.318106 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.318121 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93089553-1487-4d3b-ab46-1cd7822aa6ad-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.318137 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp28d\" (UniqueName: \"kubernetes.io/projected/93089553-1487-4d3b-ab46-1cd7822aa6ad-kube-api-access-rp28d\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.568692 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" event={"ID":"93089553-1487-4d3b-ab46-1cd7822aa6ad","Type":"ContainerDied","Data":"3399c729a9c8f7d6f054890198daebee830ed12014908dedc272b9b9f1ee3dac"} Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.568765 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3399c729a9c8f7d6f054890198daebee830ed12014908dedc272b9b9f1ee3dac" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.568764 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.692415 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw"] Dec 01 09:56:02 crc kubenswrapper[4763]: E1201 09:56:02.693467 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93089553-1487-4d3b-ab46-1cd7822aa6ad" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.693565 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="93089553-1487-4d3b-ab46-1cd7822aa6ad" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.693843 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="93089553-1487-4d3b-ab46-1cd7822aa6ad" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.694688 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.697813 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.697858 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.698344 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.698752 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.702590 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.710031 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw"] Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.842870 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t495w\" (UniqueName: \"kubernetes.io/projected/66848849-497d-4488-898a-c529d1ef2736-kube-api-access-t495w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.843021 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.843211 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.843352 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.947229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t495w\" (UniqueName: \"kubernetes.io/projected/66848849-497d-4488-898a-c529d1ef2736-kube-api-access-t495w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.947379 4763 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.947537 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.947587 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.956288 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.956978 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.962219 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:02 crc kubenswrapper[4763]: I1201 09:56:02.967755 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t495w\" (UniqueName: \"kubernetes.io/projected/66848849-497d-4488-898a-c529d1ef2736-kube-api-access-t495w\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:03 crc kubenswrapper[4763]: I1201 09:56:03.034721 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:03 crc kubenswrapper[4763]: I1201 09:56:03.739368 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw"] Dec 01 09:56:04 crc kubenswrapper[4763]: I1201 09:56:04.635965 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" event={"ID":"66848849-497d-4488-898a-c529d1ef2736","Type":"ContainerStarted","Data":"171d948cfe6237de4e734d9ce6c1d0d9396e1bc138d210e2082d1408758f5b5d"} Dec 01 09:56:04 crc kubenswrapper[4763]: I1201 09:56:04.636418 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" event={"ID":"66848849-497d-4488-898a-c529d1ef2736","Type":"ContainerStarted","Data":"cb8f7527cb251ac1952a2b67181f87b1fa09d28fdd3231b431f762f365e44e97"} Dec 01 09:56:04 crc kubenswrapper[4763]: I1201 09:56:04.666483 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" podStartSLOduration=2.124110035 podStartE2EDuration="2.666442936s" podCreationTimestamp="2025-12-01 09:56:02 +0000 UTC" firstStartedPulling="2025-12-01 09:56:03.757234989 +0000 UTC m=+2481.025883757" lastFinishedPulling="2025-12-01 09:56:04.29956789 +0000 UTC m=+2481.568216658" observedRunningTime="2025-12-01 09:56:04.656496014 +0000 UTC m=+2481.925144782" watchObservedRunningTime="2025-12-01 09:56:04.666442936 +0000 UTC m=+2481.935091714" Dec 01 09:56:08 crc kubenswrapper[4763]: I1201 09:56:08.994603 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:56:08 crc kubenswrapper[4763]: E1201 09:56:08.995370 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.280123 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v7np7"] Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.287450 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.307216 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v7np7"] Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.456294 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfhjj\" (UniqueName: \"kubernetes.io/projected/0cfa6e54-7506-41da-887e-8134b455bf9f-kube-api-access-hfhjj\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.456509 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-utilities\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.456663 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-catalog-content\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.559185 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfhjj\" (UniqueName: \"kubernetes.io/projected/0cfa6e54-7506-41da-887e-8134b455bf9f-kube-api-access-hfhjj\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.559595 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-utilities\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.559736 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-catalog-content\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.560058 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-utilities\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.560130 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-catalog-content\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.578410 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hfhjj\" (UniqueName: \"kubernetes.io/projected/0cfa6e54-7506-41da-887e-8134b455bf9f-kube-api-access-hfhjj\") pod \"redhat-operators-v7np7\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:12 crc kubenswrapper[4763]: I1201 09:56:12.632442 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:13 crc kubenswrapper[4763]: I1201 09:56:13.203637 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v7np7"] Dec 01 09:56:13 crc kubenswrapper[4763]: I1201 09:56:13.722279 4763 generic.go:334] "Generic (PLEG): container finished" podID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerID="34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5" exitCode=0 Dec 01 09:56:13 crc kubenswrapper[4763]: I1201 09:56:13.722982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7np7" event={"ID":"0cfa6e54-7506-41da-887e-8134b455bf9f","Type":"ContainerDied","Data":"34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5"} Dec 01 09:56:13 crc kubenswrapper[4763]: I1201 09:56:13.723008 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7np7" event={"ID":"0cfa6e54-7506-41da-887e-8134b455bf9f","Type":"ContainerStarted","Data":"40b91ca9d7e7dc80173fa30c65842201bbc6b2044b4f7920e375b25f274f46d8"} Dec 01 09:56:14 crc kubenswrapper[4763]: I1201 09:56:14.731489 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7np7" event={"ID":"0cfa6e54-7506-41da-887e-8134b455bf9f","Type":"ContainerStarted","Data":"e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59"} Dec 01 09:56:17 crc kubenswrapper[4763]: I1201 09:56:17.756825 4763 generic.go:334] "Generic (PLEG): container finished" podID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerID="e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59" exitCode=0 Dec 01 09:56:17 crc kubenswrapper[4763]: I1201 09:56:17.756892 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7np7" event={"ID":"0cfa6e54-7506-41da-887e-8134b455bf9f","Type":"ContainerDied","Data":"e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59"} Dec 01 09:56:19 crc kubenswrapper[4763]: I1201 09:56:19.776373 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7np7" event={"ID":"0cfa6e54-7506-41da-887e-8134b455bf9f","Type":"ContainerStarted","Data":"fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f"} Dec 01 09:56:19 crc kubenswrapper[4763]: I1201 09:56:19.796632 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v7np7" podStartSLOduration=2.828984914 podStartE2EDuration="7.79661516s" podCreationTimestamp="2025-12-01 09:56:12 +0000 UTC" firstStartedPulling="2025-12-01 09:56:13.725418986 +0000 UTC m=+2490.994067754" lastFinishedPulling="2025-12-01 09:56:18.693049222 +0000 UTC m=+2495.961698000" observedRunningTime="2025-12-01 09:56:19.793801203 +0000 UTC m=+2497.062449971" watchObservedRunningTime="2025-12-01 09:56:19.79661516 +0000 UTC m=+2497.065263928" Dec 01 09:56:21 crc kubenswrapper[4763]: I1201 09:56:21.994341 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:56:21 
crc kubenswrapper[4763]: E1201 09:56:21.994725 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:56:22 crc kubenswrapper[4763]: I1201 09:56:22.632961 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:22 crc kubenswrapper[4763]: I1201 09:56:22.633619 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:23 crc kubenswrapper[4763]: I1201 09:56:23.683747 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v7np7" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="registry-server" probeResult="failure" output=< Dec 01 09:56:23 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 09:56:23 crc kubenswrapper[4763]: > Dec 01 09:56:32 crc kubenswrapper[4763]: I1201 09:56:32.690547 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:32 crc kubenswrapper[4763]: I1201 09:56:32.761724 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:32 crc kubenswrapper[4763]: I1201 09:56:32.927449 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v7np7"] Dec 01 09:56:33 crc kubenswrapper[4763]: I1201 09:56:33.889619 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v7np7" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="registry-server" containerID="cri-o://fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f" gracePeriod=2 Dec 01 09:56:33 crc kubenswrapper[4763]: I1201 09:56:33.994319 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:56:33 crc kubenswrapper[4763]: E1201 09:56:33.994694 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.343077 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.371737 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-utilities\") pod \"0cfa6e54-7506-41da-887e-8134b455bf9f\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.371871 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfhjj\" (UniqueName: \"kubernetes.io/projected/0cfa6e54-7506-41da-887e-8134b455bf9f-kube-api-access-hfhjj\") pod \"0cfa6e54-7506-41da-887e-8134b455bf9f\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.371898 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-catalog-content\") pod \"0cfa6e54-7506-41da-887e-8134b455bf9f\" (UID: \"0cfa6e54-7506-41da-887e-8134b455bf9f\") " Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.372619 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-utilities" (OuterVolumeSpecName: "utilities") pod "0cfa6e54-7506-41da-887e-8134b455bf9f" (UID: "0cfa6e54-7506-41da-887e-8134b455bf9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.393768 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cfa6e54-7506-41da-887e-8134b455bf9f-kube-api-access-hfhjj" (OuterVolumeSpecName: "kube-api-access-hfhjj") pod "0cfa6e54-7506-41da-887e-8134b455bf9f" (UID: "0cfa6e54-7506-41da-887e-8134b455bf9f"). InnerVolumeSpecName "kube-api-access-hfhjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.474084 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.474441 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfhjj\" (UniqueName: \"kubernetes.io/projected/0cfa6e54-7506-41da-887e-8134b455bf9f-kube-api-access-hfhjj\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.511567 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cfa6e54-7506-41da-887e-8134b455bf9f" (UID: "0cfa6e54-7506-41da-887e-8134b455bf9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.576152 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cfa6e54-7506-41da-887e-8134b455bf9f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.901608 4763 generic.go:334] "Generic (PLEG): container finished" podID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerID="fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f" exitCode=0 Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.901699 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v7np7" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.901719 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7np7" event={"ID":"0cfa6e54-7506-41da-887e-8134b455bf9f","Type":"ContainerDied","Data":"fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f"} Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.903693 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v7np7" event={"ID":"0cfa6e54-7506-41da-887e-8134b455bf9f","Type":"ContainerDied","Data":"40b91ca9d7e7dc80173fa30c65842201bbc6b2044b4f7920e375b25f274f46d8"} Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.903720 4763 scope.go:117] "RemoveContainer" containerID="fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.931094 4763 scope.go:117] "RemoveContainer" containerID="e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59" Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.951498 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v7np7"] Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.958845 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v7np7"] Dec 01 09:56:34 crc kubenswrapper[4763]: I1201 09:56:34.964703 4763 scope.go:117] "RemoveContainer" containerID="34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5" Dec 01 09:56:35 crc kubenswrapper[4763]: I1201 09:56:35.005376 4763 scope.go:117] "RemoveContainer" containerID="fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f" Dec 01 09:56:35 crc kubenswrapper[4763]: E1201 09:56:35.005788 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f\": container with ID starting with fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f not found: ID does not exist" containerID="fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f" Dec 01 09:56:35 crc kubenswrapper[4763]: I1201 09:56:35.005829 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f"} err="failed to get container status \"fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f\": rpc error: code = NotFound desc = could not find container \"fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f\": container with ID starting with fed4c4702630b3ad591b9732dd486841ee7945cc48c7815b77eedc20dcc56b9f not found: ID does not exist" Dec 01 09:56:35 crc 
kubenswrapper[4763]: I1201 09:56:35.005855 4763 scope.go:117] "RemoveContainer" containerID="e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59" Dec 01 09:56:35 crc kubenswrapper[4763]: E1201 09:56:35.006129 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59\": container with ID starting with e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59 not found: ID does not exist" containerID="e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59" Dec 01 09:56:35 crc kubenswrapper[4763]: I1201 09:56:35.006160 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59"} err="failed to get container status \"e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59\": rpc error: code = NotFound desc = could not find container \"e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59\": container with ID starting with e0e1dfb8ec070bd8c0d7339f3305f2cbc21da2a3ef2c82952d15869790977b59 not found: ID does not exist" Dec 01 09:56:35 crc kubenswrapper[4763]: I1201 09:56:35.006177 4763 scope.go:117] "RemoveContainer" containerID="34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5" Dec 01 09:56:35 crc kubenswrapper[4763]: E1201 09:56:35.006420 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5\": container with ID starting with 34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5 not found: ID does not exist" containerID="34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5" Dec 01 09:56:35 crc kubenswrapper[4763]: I1201 09:56:35.006443 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5"} err="failed to get container status \"34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5\": rpc error: code = NotFound desc = could not find container \"34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5\": container with ID starting with 34dbfc1c1f65b616f99da225399550dea542dd620b21ed87929e715bcb17ada5 not found: ID does not exist" Dec 01 09:56:35 crc kubenswrapper[4763]: I1201 09:56:35.007366 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" path="/var/lib/kubelet/pods/0cfa6e54-7506-41da-887e-8134b455bf9f/volumes" Dec 01 09:56:46 crc kubenswrapper[4763]: I1201 09:56:46.994229 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:56:46 crc kubenswrapper[4763]: E1201 09:56:46.994989 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:56:56 crc kubenswrapper[4763]: I1201 09:56:56.078478 4763 generic.go:334] "Generic (PLEG): container finished" podID="66848849-497d-4488-898a-c529d1ef2736" 
containerID="171d948cfe6237de4e734d9ce6c1d0d9396e1bc138d210e2082d1408758f5b5d" exitCode=0 Dec 01 09:56:56 crc kubenswrapper[4763]: I1201 09:56:56.078564 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" event={"ID":"66848849-497d-4488-898a-c529d1ef2736","Type":"ContainerDied","Data":"171d948cfe6237de4e734d9ce6c1d0d9396e1bc138d210e2082d1408758f5b5d"} Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.469836 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.599038 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ssh-key\") pod \"66848849-497d-4488-898a-c529d1ef2736\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.599100 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ceph\") pod \"66848849-497d-4488-898a-c529d1ef2736\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.599181 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t495w\" (UniqueName: \"kubernetes.io/projected/66848849-497d-4488-898a-c529d1ef2736-kube-api-access-t495w\") pod \"66848849-497d-4488-898a-c529d1ef2736\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.599253 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-inventory\") pod \"66848849-497d-4488-898a-c529d1ef2736\" (UID: \"66848849-497d-4488-898a-c529d1ef2736\") " Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.605705 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ceph" (OuterVolumeSpecName: "ceph") pod "66848849-497d-4488-898a-c529d1ef2736" (UID: "66848849-497d-4488-898a-c529d1ef2736"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.605828 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66848849-497d-4488-898a-c529d1ef2736-kube-api-access-t495w" (OuterVolumeSpecName: "kube-api-access-t495w") pod "66848849-497d-4488-898a-c529d1ef2736" (UID: "66848849-497d-4488-898a-c529d1ef2736"). InnerVolumeSpecName "kube-api-access-t495w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.631629 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "66848849-497d-4488-898a-c529d1ef2736" (UID: "66848849-497d-4488-898a-c529d1ef2736"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.641033 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-inventory" (OuterVolumeSpecName: "inventory") pod "66848849-497d-4488-898a-c529d1ef2736" (UID: "66848849-497d-4488-898a-c529d1ef2736"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.701197 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.701230 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.701239 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66848849-497d-4488-898a-c529d1ef2736-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:57 crc kubenswrapper[4763]: I1201 09:56:57.701248 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t495w\" (UniqueName: \"kubernetes.io/projected/66848849-497d-4488-898a-c529d1ef2736-kube-api-access-t495w\") on node \"crc\" DevicePath \"\"" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.096667 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" event={"ID":"66848849-497d-4488-898a-c529d1ef2736","Type":"ContainerDied","Data":"cb8f7527cb251ac1952a2b67181f87b1fa09d28fdd3231b431f762f365e44e97"} Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.096992 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb8f7527cb251ac1952a2b67181f87b1fa09d28fdd3231b431f762f365e44e97" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.096718 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.203147 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q84bf"] Dec 01 09:56:58 crc kubenswrapper[4763]: E1201 09:56:58.203882 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="registry-server" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.203907 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="registry-server" Dec 01 09:56:58 crc kubenswrapper[4763]: E1201 09:56:58.203930 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66848849-497d-4488-898a-c529d1ef2736" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.203944 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="66848849-497d-4488-898a-c529d1ef2736" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:58 crc kubenswrapper[4763]: E1201 09:56:58.203960 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="extract-content" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.203967 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="extract-content" Dec 01 09:56:58 crc kubenswrapper[4763]: E1201 09:56:58.203987 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="extract-utilities" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.203993 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="extract-utilities" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.204183 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cfa6e54-7506-41da-887e-8134b455bf9f" containerName="registry-server" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.204209 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="66848849-497d-4488-898a-c529d1ef2736" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.204855 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.206860 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.207242 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.207432 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.207683 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.207875 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.220216 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q84bf"] Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.312637 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ceph\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.312719 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97fn\" (UniqueName: \"kubernetes.io/projected/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-kube-api-access-m97fn\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.312843 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.312874 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.414000 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ceph\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.414092 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97fn\" (UniqueName: \"kubernetes.io/projected/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-kube-api-access-m97fn\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: 
\"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.414189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.414226 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.422261 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ceph\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.426379 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.426657 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.431965 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97fn\" (UniqueName: \"kubernetes.io/projected/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-kube-api-access-m97fn\") pod \"ssh-known-hosts-edpm-deployment-q84bf\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:58 crc kubenswrapper[4763]: I1201 09:56:58.541756 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:56:59 crc kubenswrapper[4763]: I1201 09:56:59.011777 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-q84bf"] Dec 01 09:56:59 crc kubenswrapper[4763]: I1201 09:56:59.104405 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" event={"ID":"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb","Type":"ContainerStarted","Data":"adecfb2979b8dc0115b7ccbaf74ace9e6eac0b25ac8ef41e588215e339049931"} Dec 01 09:57:00 crc kubenswrapper[4763]: I1201 09:57:00.114340 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" event={"ID":"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb","Type":"ContainerStarted","Data":"d1863c32b17ad10a2b03b8182cfe8e32ee9db49829a16f1aa8c63258b43b26b6"} Dec 01 09:57:00 crc kubenswrapper[4763]: I1201 09:57:00.141157 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" podStartSLOduration=1.660100618 podStartE2EDuration="2.141127143s" podCreationTimestamp="2025-12-01 09:56:58 +0000 UTC" firstStartedPulling="2025-12-01 09:56:59.023038458 +0000 UTC m=+2536.291687226" lastFinishedPulling="2025-12-01 09:56:59.504064983 +0000 UTC m=+2536.772713751" observedRunningTime="2025-12-01 09:57:00.13297935 +0000 UTC m=+2537.401628118" watchObservedRunningTime="2025-12-01 09:57:00.141127143 +0000 UTC m=+2537.409775931" Dec 01 09:57:01 crc kubenswrapper[4763]: I1201 09:57:01.994508 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:57:01 crc kubenswrapper[4763]: E1201 09:57:01.995035 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:57:10 crc kubenswrapper[4763]: I1201 09:57:10.206679 4763 generic.go:334] "Generic (PLEG): container finished" podID="57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb" containerID="d1863c32b17ad10a2b03b8182cfe8e32ee9db49829a16f1aa8c63258b43b26b6" exitCode=0 Dec 01 09:57:10 crc kubenswrapper[4763]: I1201 09:57:10.207128 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" event={"ID":"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb","Type":"ContainerDied","Data":"d1863c32b17ad10a2b03b8182cfe8e32ee9db49829a16f1aa8c63258b43b26b6"} Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.700834 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.883090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ssh-key-openstack-edpm-ipam\") pod \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.883190 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-inventory-0\") pod \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.883296 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m97fn\" (UniqueName: \"kubernetes.io/projected/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-kube-api-access-m97fn\") pod \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.883347 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ceph\") pod \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\" (UID: \"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb\") " Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.898675 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ceph" (OuterVolumeSpecName: "ceph") pod "57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb" (UID: "57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.912268 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb" (UID: "57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.913895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-kube-api-access-m97fn" (OuterVolumeSpecName: "kube-api-access-m97fn") pod "57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb" (UID: "57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb"). InnerVolumeSpecName "kube-api-access-m97fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.916690 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb" (UID: "57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.985316 4763 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.985343 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m97fn\" (UniqueName: \"kubernetes.io/projected/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-kube-api-access-m97fn\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.985354 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:11 crc kubenswrapper[4763]: I1201 09:57:11.985363 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.226023 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" event={"ID":"57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb","Type":"ContainerDied","Data":"adecfb2979b8dc0115b7ccbaf74ace9e6eac0b25ac8ef41e588215e339049931"} Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.226071 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adecfb2979b8dc0115b7ccbaf74ace9e6eac0b25ac8ef41e588215e339049931" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.226111 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-q84bf" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.328237 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m"] Dec 01 09:57:12 crc kubenswrapper[4763]: E1201 09:57:12.329003 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb" containerName="ssh-known-hosts-edpm-deployment" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.329024 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb" containerName="ssh-known-hosts-edpm-deployment" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.329343 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb" containerName="ssh-known-hosts-edpm-deployment" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.331750 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.334981 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.335232 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.335602 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.335608 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.336750 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.351081 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m"] Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.494761 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.494980 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.495025 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn7zr\" (UniqueName: \"kubernetes.io/projected/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-kube-api-access-cn7zr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.495060 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.597063 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.597128 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn7zr\" (UniqueName: 
\"kubernetes.io/projected/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-kube-api-access-cn7zr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.597160 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.597248 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.602509 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.603728 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.611173 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.617594 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn7zr\" (UniqueName: \"kubernetes.io/projected/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-kube-api-access-cn7zr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zbl9m\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:12 crc kubenswrapper[4763]: I1201 09:57:12.647477 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:13 crc kubenswrapper[4763]: I1201 09:57:13.135556 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m"] Dec 01 09:57:13 crc kubenswrapper[4763]: I1201 09:57:13.233060 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" event={"ID":"b41d9a03-82fb-4f14-b13c-0437ae28a1a7","Type":"ContainerStarted","Data":"8343b3f2b6d4664aa283ab957540c0e7b301c2e248269967bd94ef324727e5f8"} Dec 01 09:57:14 crc kubenswrapper[4763]: I1201 09:57:14.240872 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" event={"ID":"b41d9a03-82fb-4f14-b13c-0437ae28a1a7","Type":"ContainerStarted","Data":"d6e06cac6de79ff30f8e6329c0969e4935af3569ddaec61621f0a33108e72658"} Dec 01 09:57:14 crc kubenswrapper[4763]: I1201 09:57:14.275304 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" podStartSLOduration=1.809362538 podStartE2EDuration="2.275289152s" podCreationTimestamp="2025-12-01 09:57:12 +0000 UTC" firstStartedPulling="2025-12-01 09:57:13.135188865 +0000 UTC m=+2550.403837633" lastFinishedPulling="2025-12-01 09:57:13.601115479 +0000 UTC m=+2550.869764247" observedRunningTime="2025-12-01 09:57:14.264382974 +0000 UTC m=+2551.533031742" watchObservedRunningTime="2025-12-01 09:57:14.275289152 +0000 UTC m=+2551.543937920" Dec 01 09:57:15 crc kubenswrapper[4763]: I1201 09:57:15.995066 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:57:15 crc kubenswrapper[4763]: E1201 09:57:15.995389 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:57:23 crc kubenswrapper[4763]: I1201 09:57:23.316664 4763 generic.go:334] "Generic (PLEG): container finished" podID="b41d9a03-82fb-4f14-b13c-0437ae28a1a7" containerID="d6e06cac6de79ff30f8e6329c0969e4935af3569ddaec61621f0a33108e72658" exitCode=0 Dec 01 09:57:23 crc kubenswrapper[4763]: I1201 09:57:23.316749 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" event={"ID":"b41d9a03-82fb-4f14-b13c-0437ae28a1a7","Type":"ContainerDied","Data":"d6e06cac6de79ff30f8e6329c0969e4935af3569ddaec61621f0a33108e72658"} Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.837267 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.941252 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ceph\") pod \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.941347 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-inventory\") pod \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.941376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ssh-key\") pod \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.941495 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn7zr\" (UniqueName: \"kubernetes.io/projected/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-kube-api-access-cn7zr\") pod \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\" (UID: \"b41d9a03-82fb-4f14-b13c-0437ae28a1a7\") " Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.946580 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ceph" (OuterVolumeSpecName: "ceph") pod "b41d9a03-82fb-4f14-b13c-0437ae28a1a7" (UID: "b41d9a03-82fb-4f14-b13c-0437ae28a1a7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.948692 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-kube-api-access-cn7zr" (OuterVolumeSpecName: "kube-api-access-cn7zr") pod "b41d9a03-82fb-4f14-b13c-0437ae28a1a7" (UID: "b41d9a03-82fb-4f14-b13c-0437ae28a1a7"). InnerVolumeSpecName "kube-api-access-cn7zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.972026 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-inventory" (OuterVolumeSpecName: "inventory") pod "b41d9a03-82fb-4f14-b13c-0437ae28a1a7" (UID: "b41d9a03-82fb-4f14-b13c-0437ae28a1a7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:24 crc kubenswrapper[4763]: I1201 09:57:24.979505 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b41d9a03-82fb-4f14-b13c-0437ae28a1a7" (UID: "b41d9a03-82fb-4f14-b13c-0437ae28a1a7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.052393 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.052426 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.052438 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.052449 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn7zr\" (UniqueName: \"kubernetes.io/projected/b41d9a03-82fb-4f14-b13c-0437ae28a1a7-kube-api-access-cn7zr\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.345206 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" event={"ID":"b41d9a03-82fb-4f14-b13c-0437ae28a1a7","Type":"ContainerDied","Data":"8343b3f2b6d4664aa283ab957540c0e7b301c2e248269967bd94ef324727e5f8"} Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.345538 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8343b3f2b6d4664aa283ab957540c0e7b301c2e248269967bd94ef324727e5f8" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.345317 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zbl9m" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.436363 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx"] Dec 01 09:57:25 crc kubenswrapper[4763]: E1201 09:57:25.436741 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41d9a03-82fb-4f14-b13c-0437ae28a1a7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.436760 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41d9a03-82fb-4f14-b13c-0437ae28a1a7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.436953 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41d9a03-82fb-4f14-b13c-0437ae28a1a7" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.437571 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.441474 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.441559 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.441628 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.442058 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.442329 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.444634 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx"] Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.560785 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl8m8\" (UniqueName: \"kubernetes.io/projected/2308a6d4-3af7-4772-8413-3803ac516e1c-kube-api-access-gl8m8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.560896 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.560931 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.560989 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.663437 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.663637 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.663877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl8m8\" (UniqueName: \"kubernetes.io/projected/2308a6d4-3af7-4772-8413-3803ac516e1c-kube-api-access-gl8m8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.664329 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.668299 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.673983 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.676910 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.689286 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl8m8\" (UniqueName: \"kubernetes.io/projected/2308a6d4-3af7-4772-8413-3803ac516e1c-kube-api-access-gl8m8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:25 crc kubenswrapper[4763]: I1201 09:57:25.757343 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:26 crc kubenswrapper[4763]: I1201 09:57:26.324221 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx"] Dec 01 09:57:26 crc kubenswrapper[4763]: I1201 09:57:26.353040 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" event={"ID":"2308a6d4-3af7-4772-8413-3803ac516e1c","Type":"ContainerStarted","Data":"c968406fc746c72231ce0ede6925107ee45658538b76c6f48cd2267eac36003c"} Dec 01 09:57:27 crc kubenswrapper[4763]: I1201 09:57:27.362630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" event={"ID":"2308a6d4-3af7-4772-8413-3803ac516e1c","Type":"ContainerStarted","Data":"0fb55fef9e90b9749029d34e12520912220b04e1ea199449811b94a8395fcd8e"} Dec 01 09:57:27 crc kubenswrapper[4763]: I1201 09:57:27.384593 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" podStartSLOduration=1.8773873540000001 podStartE2EDuration="2.384573875s" podCreationTimestamp="2025-12-01 09:57:25 +0000 UTC" firstStartedPulling="2025-12-01 09:57:26.330516079 +0000 UTC m=+2563.599164847" lastFinishedPulling="2025-12-01 09:57:26.8377026 +0000 UTC m=+2564.106351368" observedRunningTime="2025-12-01 09:57:27.383535956 +0000 UTC m=+2564.652184724" watchObservedRunningTime="2025-12-01 09:57:27.384573875 +0000 UTC m=+2564.653222653" Dec 01 09:57:27 crc kubenswrapper[4763]: I1201 09:57:27.996247 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:57:27 crc kubenswrapper[4763]: E1201 09:57:27.996762 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:57:37 crc kubenswrapper[4763]: I1201 09:57:37.446797 4763 generic.go:334] "Generic (PLEG): container finished" podID="2308a6d4-3af7-4772-8413-3803ac516e1c" containerID="0fb55fef9e90b9749029d34e12520912220b04e1ea199449811b94a8395fcd8e" exitCode=0 Dec 01 09:57:37 crc kubenswrapper[4763]: I1201 09:57:37.446928 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" event={"ID":"2308a6d4-3af7-4772-8413-3803ac516e1c","Type":"ContainerDied","Data":"0fb55fef9e90b9749029d34e12520912220b04e1ea199449811b94a8395fcd8e"} Dec 01 09:57:38 crc kubenswrapper[4763]: I1201 09:57:38.867845 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:38 crc kubenswrapper[4763]: I1201 09:57:38.994523 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:57:38 crc kubenswrapper[4763]: E1201 09:57:38.994872 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.025230 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl8m8\" (UniqueName: \"kubernetes.io/projected/2308a6d4-3af7-4772-8413-3803ac516e1c-kube-api-access-gl8m8\") pod \"2308a6d4-3af7-4772-8413-3803ac516e1c\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.025699 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ssh-key\") pod \"2308a6d4-3af7-4772-8413-3803ac516e1c\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.025781 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-inventory\") pod \"2308a6d4-3af7-4772-8413-3803ac516e1c\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.025948 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ceph\") pod \"2308a6d4-3af7-4772-8413-3803ac516e1c\" (UID: \"2308a6d4-3af7-4772-8413-3803ac516e1c\") " Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.031256 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ceph" (OuterVolumeSpecName: "ceph") pod "2308a6d4-3af7-4772-8413-3803ac516e1c" (UID: "2308a6d4-3af7-4772-8413-3803ac516e1c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.035164 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2308a6d4-3af7-4772-8413-3803ac516e1c-kube-api-access-gl8m8" (OuterVolumeSpecName: "kube-api-access-gl8m8") pod "2308a6d4-3af7-4772-8413-3803ac516e1c" (UID: "2308a6d4-3af7-4772-8413-3803ac516e1c"). InnerVolumeSpecName "kube-api-access-gl8m8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.052241 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-inventory" (OuterVolumeSpecName: "inventory") pod "2308a6d4-3af7-4772-8413-3803ac516e1c" (UID: "2308a6d4-3af7-4772-8413-3803ac516e1c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.062108 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2308a6d4-3af7-4772-8413-3803ac516e1c" (UID: "2308a6d4-3af7-4772-8413-3803ac516e1c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.128880 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl8m8\" (UniqueName: \"kubernetes.io/projected/2308a6d4-3af7-4772-8413-3803ac516e1c-kube-api-access-gl8m8\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.128917 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.128928 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.128938 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2308a6d4-3af7-4772-8413-3803ac516e1c-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.464472 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" event={"ID":"2308a6d4-3af7-4772-8413-3803ac516e1c","Type":"ContainerDied","Data":"c968406fc746c72231ce0ede6925107ee45658538b76c6f48cd2267eac36003c"} Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.464517 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c968406fc746c72231ce0ede6925107ee45658538b76c6f48cd2267eac36003c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.464524 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.575989 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c"] Dec 01 09:57:39 crc kubenswrapper[4763]: E1201 09:57:39.576491 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2308a6d4-3af7-4772-8413-3803ac516e1c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.576517 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2308a6d4-3af7-4772-8413-3803ac516e1c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.576755 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2308a6d4-3af7-4772-8413-3803ac516e1c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.577559 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.580531 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.580774 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.588711 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.588986 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.589172 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.589366 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.589628 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.589813 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.595109 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c"] Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.643510 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.643663 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.643898 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzvb\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-kube-api-access-ktzvb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.643976 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: 
\"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.644026 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.644067 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.644105 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.644208 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.644238 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.644262 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.644291 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 
09:57:39.644320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.644337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.745868 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.745925 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.745995 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746051 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746085 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746148 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzvb\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-kube-api-access-ktzvb\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746185 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746215 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746245 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746272 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746315 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746329 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.746350 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.750491 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.750822 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.750841 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.751271 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.752838 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.753723 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.753874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.754215 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" 
Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.754240 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.755437 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.755635 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.756712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.769495 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzvb\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-kube-api-access-ktzvb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:39 crc kubenswrapper[4763]: I1201 09:57:39.898345 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:57:40 crc kubenswrapper[4763]: I1201 09:57:40.437202 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c"] Dec 01 09:57:40 crc kubenswrapper[4763]: I1201 09:57:40.472994 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" event={"ID":"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4","Type":"ContainerStarted","Data":"adb90435f6854ff94f2f5c446f67d2957fc04cbce48c63232670881dbd846cfe"} Dec 01 09:57:42 crc kubenswrapper[4763]: I1201 09:57:42.493585 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" event={"ID":"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4","Type":"ContainerStarted","Data":"289a76fe3c3a0f6605d1d6745b1aa1148704cfabe2220840ef462bb82f5d4045"} Dec 01 09:57:42 crc kubenswrapper[4763]: I1201 09:57:42.519587 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" podStartSLOduration=1.913740622 podStartE2EDuration="3.519558846s" podCreationTimestamp="2025-12-01 09:57:39 +0000 UTC" firstStartedPulling="2025-12-01 09:57:40.455482459 +0000 UTC m=+2577.724131227" lastFinishedPulling="2025-12-01 09:57:42.061300683 +0000 UTC m=+2579.329949451" observedRunningTime="2025-12-01 09:57:42.512134933 +0000 UTC m=+2579.780783701" watchObservedRunningTime="2025-12-01 09:57:42.519558846 +0000 UTC m=+2579.788207614" Dec 01 09:57:49 crc kubenswrapper[4763]: I1201 09:57:49.993902 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:57:49 crc kubenswrapper[4763]: E1201 09:57:49.994728 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:58:04 crc kubenswrapper[4763]: I1201 09:58:04.993650 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:58:04 crc kubenswrapper[4763]: E1201 09:58:04.995296 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:58:15 crc kubenswrapper[4763]: I1201 09:58:15.773963 4763 generic.go:334] "Generic (PLEG): container finished" podID="7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" containerID="289a76fe3c3a0f6605d1d6745b1aa1148704cfabe2220840ef462bb82f5d4045" exitCode=0 Dec 01 09:58:15 crc kubenswrapper[4763]: I1201 09:58:15.774038 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" 
event={"ID":"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4","Type":"ContainerDied","Data":"289a76fe3c3a0f6605d1d6745b1aa1148704cfabe2220840ef462bb82f5d4045"} Dec 01 09:58:15 crc kubenswrapper[4763]: I1201 09:58:15.994569 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:58:15 crc kubenswrapper[4763]: E1201 09:58:15.994844 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.186265 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356166 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356281 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktzvb\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-kube-api-access-ktzvb\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356319 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ssh-key\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356359 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356480 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-inventory\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356514 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-libvirt-combined-ca-bundle\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356559 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-neutron-metadata-combined-ca-bundle\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356585 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-nova-combined-ca-bundle\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356719 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-bootstrap-combined-ca-bundle\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356754 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ovn-combined-ca-bundle\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356787 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-repo-setup-combined-ca-bundle\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356825 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ceph\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.356853 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\" (UID: \"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4\") " Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.362934 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.364187 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.365436 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.365542 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.366105 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ceph" (OuterVolumeSpecName: "ceph") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.366794 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-kube-api-access-ktzvb" (OuterVolumeSpecName: "kube-api-access-ktzvb") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "kube-api-access-ktzvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.367734 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.369248 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.370189 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.371583 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.372991 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.388359 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.403921 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-inventory" (OuterVolumeSpecName: "inventory") pod "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" (UID: "7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458777 4763 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458813 4763 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458823 4763 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458832 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458845 4763 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458854 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458864 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458875 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458884 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktzvb\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-kube-api-access-ktzvb\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458892 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458901 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.458909 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc 
kubenswrapper[4763]: I1201 09:58:17.458917 4763 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.793031 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" event={"ID":"7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4","Type":"ContainerDied","Data":"adb90435f6854ff94f2f5c446f67d2957fc04cbce48c63232670881dbd846cfe"} Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.793067 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adb90435f6854ff94f2f5c446f67d2957fc04cbce48c63232670881dbd846cfe" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.793075 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.898088 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc"] Dec 01 09:58:17 crc kubenswrapper[4763]: E1201 09:58:17.898482 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.898499 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.898694 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
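
The entries above close out a complete teardown for pod UID 7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4: reconciler_common.go:159 marks each UnmountVolume as started, operation_generator.go:803 confirms TearDown, and reconciler_common.go:293 reports every volume detached with DevicePath \"\" (secret, projected, and configmap volumes are directory-backed, so there is no block device to release). The same three-step pattern repeats for every job pod in this deployment chain, which makes it easy to audit mechanically. The sketch below is a minimal, illustrative audit in Go, not a kubelet component; its regular expressions assume the exact klog quoting visible in these lines.

```go
// teardown_audit.go — a minimal sketch (not part of kubelet) that pairs
// "UnmountVolume started" entries with their "Volume detached" confirmations
// for one pod UID. Usage: journalctl -u kubelet | go run teardown_audit.go <pod-uid>
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

func main() {
	uid := os.Args[1]
	// The journal escapes quotes inside klog messages, hence \\" in the patterns.
	started := regexp.MustCompile(`UnmountVolume started for volume \\"([^\\]+)\\"`)
	detached := regexp.MustCompile(`Volume detached for volume \\"([^\\]+)\\"`)

	phase := map[string]string{} // volume name -> last phase seen
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // these journal lines get long
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, uid) {
			continue
		}
		if m := started.FindStringSubmatch(line); m != nil {
			phase[m[1]] = "unmount started"
		}
		if m := detached.FindStringSubmatch(line); m != nil {
			phase[m[1]] = "detached"
		}
	}
	for vol, p := range phase {
		fmt.Printf("%-60s %s\n", vol, p) // anything not "detached" is stuck
	}
}
```

Fed with a pod UID, this lists any volume whose unmount never reached the detached state; in the teardown above, all thirteen volumes reach it within ~100ms.

Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.899274 4763 util.go:30] "No sandbox for pod can be found.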
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.901175 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.902629 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.904096 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.905687 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.905819 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:58:17 crc kubenswrapper[4763]: I1201 09:58:17.917349 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc"] Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.068983 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.069071 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.069149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.069211 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22kp\" (UniqueName: \"kubernetes.io/projected/e060abda-70ed-4adb-8756-15046c2a2f9d-kube-api-access-t22kp\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.171412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.171575 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.171632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22kp\" (UniqueName: \"kubernetes.io/projected/e060abda-70ed-4adb-8756-15046c2a2f9d-kube-api-access-t22kp\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.171720 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.176259 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.182235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.183317 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.190936 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22kp\" (UniqueName: \"kubernetes.io/projected/e060abda-70ed-4adb-8756-15046c2a2f9d-kube-api-access-t22kp\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.223224 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.776413 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc"] Dec 01 09:58:18 crc kubenswrapper[4763]: I1201 09:58:18.803891 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" event={"ID":"e060abda-70ed-4adb-8756-15046c2a2f9d","Type":"ContainerStarted","Data":"26031fd27bc987dad633cca0bffccb2b66b5e33745c967250b365a56b01dca68"} Dec 01 09:58:20 crc kubenswrapper[4763]: I1201 09:58:20.822433 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" event={"ID":"e060abda-70ed-4adb-8756-15046c2a2f9d","Type":"ContainerStarted","Data":"26b781591d87519213c7fcca7568dee5a415445dad02933dd53d863786621ae4"} Dec 01 09:58:20 crc kubenswrapper[4763]: I1201 09:58:20.845569 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" podStartSLOduration=3.004929357 podStartE2EDuration="3.845546096s" podCreationTimestamp="2025-12-01 09:58:17 +0000 UTC" firstStartedPulling="2025-12-01 09:58:18.778159151 +0000 UTC m=+2616.046807919" lastFinishedPulling="2025-12-01 09:58:19.61877589 +0000 UTC m=+2616.887424658" observedRunningTime="2025-12-01 09:58:20.834794832 +0000 UTC m=+2618.103443610" watchObservedRunningTime="2025-12-01 09:58:20.845546096 +0000 UTC m=+2618.114194884" Dec 01 09:58:25 crc kubenswrapper[4763]: I1201 09:58:25.857811 4763 generic.go:334] "Generic (PLEG): container finished" podID="e060abda-70ed-4adb-8756-15046c2a2f9d" containerID="26b781591d87519213c7fcca7568dee5a415445dad02933dd53d863786621ae4" exitCode=0 Dec 01 09:58:25 crc kubenswrapper[4763]: I1201 09:58:25.857918 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" event={"ID":"e060abda-70ed-4adb-8756-15046c2a2f9d","Type":"ContainerDied","Data":"26b781591d87519213c7fcca7568dee5a415445dad02933dd53d863786621ae4"} Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.276391 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.436296 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ssh-key\") pod \"e060abda-70ed-4adb-8756-15046c2a2f9d\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.436354 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ceph\") pod \"e060abda-70ed-4adb-8756-15046c2a2f9d\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.436442 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-inventory\") pod \"e060abda-70ed-4adb-8756-15046c2a2f9d\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.436495 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t22kp\" (UniqueName: \"kubernetes.io/projected/e060abda-70ed-4adb-8756-15046c2a2f9d-kube-api-access-t22kp\") pod \"e060abda-70ed-4adb-8756-15046c2a2f9d\" (UID: \"e060abda-70ed-4adb-8756-15046c2a2f9d\") " Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.448719 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ceph" (OuterVolumeSpecName: "ceph") pod "e060abda-70ed-4adb-8756-15046c2a2f9d" (UID: "e060abda-70ed-4adb-8756-15046c2a2f9d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.448747 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e060abda-70ed-4adb-8756-15046c2a2f9d-kube-api-access-t22kp" (OuterVolumeSpecName: "kube-api-access-t22kp") pod "e060abda-70ed-4adb-8756-15046c2a2f9d" (UID: "e060abda-70ed-4adb-8756-15046c2a2f9d"). InnerVolumeSpecName "kube-api-access-t22kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.472758 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e060abda-70ed-4adb-8756-15046c2a2f9d" (UID: "e060abda-70ed-4adb-8756-15046c2a2f9d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.474239 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-inventory" (OuterVolumeSpecName: "inventory") pod "e060abda-70ed-4adb-8756-15046c2a2f9d" (UID: "e060abda-70ed-4adb-8756-15046c2a2f9d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.539236 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.539580 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.539598 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e060abda-70ed-4adb-8756-15046c2a2f9d-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.539613 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t22kp\" (UniqueName: \"kubernetes.io/projected/e060abda-70ed-4adb-8756-15046c2a2f9d-kube-api-access-t22kp\") on node \"crc\" DevicePath \"\"" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.875956 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" event={"ID":"e060abda-70ed-4adb-8756-15046c2a2f9d","Type":"ContainerDied","Data":"26031fd27bc987dad633cca0bffccb2b66b5e33745c967250b365a56b01dca68"} Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.876017 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26031fd27bc987dad633cca0bffccb2b66b5e33745c967250b365a56b01dca68" Dec 01 09:58:27 crc kubenswrapper[4763]: I1201 09:58:27.876023 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.003177 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w"] Dec 01 09:58:28 crc kubenswrapper[4763]: E1201 09:58:28.003595 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e060abda-70ed-4adb-8756-15046c2a2f9d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.003613 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e060abda-70ed-4adb-8756-15046c2a2f9d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.003800 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e060abda-70ed-4adb-8756-15046c2a2f9d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.004375 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.005989 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.006597 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.006642 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.006760 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.010353 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.015836 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.025912 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w"] Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.149386 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.149436 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.149544 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.149582 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgcmd\" (UniqueName: \"kubernetes.io/projected/be3fab30-e99d-4b1a-ba2c-86326fbeb363-kube-api-access-wgcmd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.149643 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 
crc kubenswrapper[4763]: I1201 09:58:28.149705 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.250778 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.250838 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgcmd\" (UniqueName: \"kubernetes.io/projected/be3fab30-e99d-4b1a-ba2c-86326fbeb363-kube-api-access-wgcmd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.250869 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.250908 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.250982 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.251003 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.252079 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.255275 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.262664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.263216 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.268634 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgcmd\" (UniqueName: \"kubernetes.io/projected/be3fab30-e99d-4b1a-ba2c-86326fbeb363-kube-api-access-wgcmd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.270749 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d82w\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.331992 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.863490 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w"] Dec 01 09:58:28 crc kubenswrapper[4763]: I1201 09:58:28.886179 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" event={"ID":"be3fab30-e99d-4b1a-ba2c-86326fbeb363","Type":"ContainerStarted","Data":"934a699564a56d20a4770468881d067ff78c48ee620444535051f9cfab98a1d2"} Dec 01 09:58:29 crc kubenswrapper[4763]: I1201 09:58:29.896049 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" event={"ID":"be3fab30-e99d-4b1a-ba2c-86326fbeb363","Type":"ContainerStarted","Data":"c535dd439ba609a982719ba5e918cf15bc0ce8e0f9bbd4098783ccea2583189d"} Dec 01 09:58:29 crc kubenswrapper[4763]: I1201 09:58:29.916338 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" podStartSLOduration=2.328550609 podStartE2EDuration="2.916320689s" podCreationTimestamp="2025-12-01 09:58:27 +0000 UTC" firstStartedPulling="2025-12-01 09:58:28.856888499 +0000 UTC m=+2626.125537267" lastFinishedPulling="2025-12-01 09:58:29.444658579 +0000 UTC m=+2626.713307347" observedRunningTime="2025-12-01 09:58:29.910920633 +0000 UTC m=+2627.179569401" watchObservedRunningTime="2025-12-01 09:58:29.916320689 +0000 UTC m=+2627.184969457" Dec 01 09:58:29 crc kubenswrapper[4763]: I1201 09:58:29.994472 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:58:29 crc kubenswrapper[4763]: E1201 09:58:29.994855 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:58:41 crc kubenswrapper[4763]: I1201 09:58:41.995219 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:58:41 crc kubenswrapper[4763]: E1201 09:58:41.995935 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:58:55 crc kubenswrapper[4763]: I1201 09:58:55.995172 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:58:55 crc kubenswrapper[4763]: E1201 09:58:55.995896 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" 
podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:59:09 crc kubenswrapper[4763]: I1201 09:59:08.993332 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:59:09 crc kubenswrapper[4763]: E1201 09:59:08.994088 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:59:21 crc kubenswrapper[4763]: I1201 09:59:21.994585 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:59:21 crc kubenswrapper[4763]: E1201 09:59:21.995253 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 09:59:36 crc kubenswrapper[4763]: I1201 09:59:36.994443 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 09:59:37 crc kubenswrapper[4763]: I1201 09:59:37.614247 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"f702c11f8b0131f090364f487a6cf11628da1960227361edee9a7525bb67ff62"} Dec 01 09:59:49 crc kubenswrapper[4763]: I1201 09:59:49.717534 4763 generic.go:334] "Generic (PLEG): container finished" podID="be3fab30-e99d-4b1a-ba2c-86326fbeb363" containerID="c535dd439ba609a982719ba5e918cf15bc0ce8e0f9bbd4098783ccea2583189d" exitCode=0 Dec 01 09:59:49 crc kubenswrapper[4763]: I1201 09:59:49.717619 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" event={"ID":"be3fab30-e99d-4b1a-ba2c-86326fbeb363","Type":"ContainerDied","Data":"c535dd439ba609a982719ba5e918cf15bc0ce8e0f9bbd4098783ccea2583189d"} Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.254044 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.425635 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ceph\") pod \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.425751 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovn-combined-ca-bundle\") pod \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.425915 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovncontroller-config-0\") pod \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.425948 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgcmd\" (UniqueName: \"kubernetes.io/projected/be3fab30-e99d-4b1a-ba2c-86326fbeb363-kube-api-access-wgcmd\") pod \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.425978 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-inventory\") pod \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.426032 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ssh-key\") pod \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\" (UID: \"be3fab30-e99d-4b1a-ba2c-86326fbeb363\") " Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.432104 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "be3fab30-e99d-4b1a-ba2c-86326fbeb363" (UID: "be3fab30-e99d-4b1a-ba2c-86326fbeb363"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.433419 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ceph" (OuterVolumeSpecName: "ceph") pod "be3fab30-e99d-4b1a-ba2c-86326fbeb363" (UID: "be3fab30-e99d-4b1a-ba2c-86326fbeb363"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.437079 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3fab30-e99d-4b1a-ba2c-86326fbeb363-kube-api-access-wgcmd" (OuterVolumeSpecName: "kube-api-access-wgcmd") pod "be3fab30-e99d-4b1a-ba2c-86326fbeb363" (UID: "be3fab30-e99d-4b1a-ba2c-86326fbeb363"). InnerVolumeSpecName "kube-api-access-wgcmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.456016 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "be3fab30-e99d-4b1a-ba2c-86326fbeb363" (UID: "be3fab30-e99d-4b1a-ba2c-86326fbeb363"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.469886 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "be3fab30-e99d-4b1a-ba2c-86326fbeb363" (UID: "be3fab30-e99d-4b1a-ba2c-86326fbeb363"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.473723 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-inventory" (OuterVolumeSpecName: "inventory") pod "be3fab30-e99d-4b1a-ba2c-86326fbeb363" (UID: "be3fab30-e99d-4b1a-ba2c-86326fbeb363"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.527798 4763 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.527836 4763 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.527851 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgcmd\" (UniqueName: \"kubernetes.io/projected/be3fab30-e99d-4b1a-ba2c-86326fbeb363-kube-api-access-wgcmd\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.527863 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.527877 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.527889 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be3fab30-e99d-4b1a-ba2c-86326fbeb363-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.734062 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" event={"ID":"be3fab30-e99d-4b1a-ba2c-86326fbeb363","Type":"ContainerDied","Data":"934a699564a56d20a4770468881d067ff78c48ee620444535051f9cfab98a1d2"} Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.734105 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d82w" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.734108 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="934a699564a56d20a4770468881d067ff78c48ee620444535051f9cfab98a1d2" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.838534 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk"] Dec 01 09:59:51 crc kubenswrapper[4763]: E1201 09:59:51.838866 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3fab30-e99d-4b1a-ba2c-86326fbeb363" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.838882 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3fab30-e99d-4b1a-ba2c-86326fbeb363" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.839070 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3fab30-e99d-4b1a-ba2c-86326fbeb363" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.839684 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.842063 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.842670 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.842729 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.842748 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.842792 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.842815 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.843617 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.857360 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk"] Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.934558 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.934621 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.934686 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.934763 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.934794 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.934939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcdgt\" (UniqueName: \"kubernetes.io/projected/a6949d90-ef2d-4555-87b8-0929fd2048b4-kube-api-access-mcdgt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:51 crc kubenswrapper[4763]: I1201 09:59:51.935021 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.037216 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.037280 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.037318 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.037342 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.037371 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.037468 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcdgt\" (UniqueName: \"kubernetes.io/projected/a6949d90-ef2d-4555-87b8-0929fd2048b4-kube-api-access-mcdgt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.037500 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.041550 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.042520 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.042767 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: 
\"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.044394 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.061796 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.061983 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.074741 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcdgt\" (UniqueName: \"kubernetes.io/projected/a6949d90-ef2d-4555-87b8-0929fd2048b4-kube-api-access-mcdgt\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.154683 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.705482 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk"] Dec 01 09:59:52 crc kubenswrapper[4763]: I1201 09:59:52.744732 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" event={"ID":"a6949d90-ef2d-4555-87b8-0929fd2048b4","Type":"ContainerStarted","Data":"e75c22c38da16834a78eb109788bd93598752932eb701299e863daed510ee555"} Dec 01 09:59:53 crc kubenswrapper[4763]: I1201 09:59:53.756688 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" event={"ID":"a6949d90-ef2d-4555-87b8-0929fd2048b4","Type":"ContainerStarted","Data":"d9c8e4cccfc506ada9897db61865aa280d7285c46a0e1df66aca07d8e2f03d54"} Dec 01 09:59:53 crc kubenswrapper[4763]: I1201 09:59:53.807193 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" podStartSLOduration=2.229795093 podStartE2EDuration="2.807166198s" podCreationTimestamp="2025-12-01 09:59:51 +0000 UTC" firstStartedPulling="2025-12-01 09:59:52.721974354 +0000 UTC m=+2709.990623112" lastFinishedPulling="2025-12-01 09:59:53.299345449 +0000 UTC m=+2710.567994217" observedRunningTime="2025-12-01 09:59:53.793938036 +0000 UTC m=+2711.062586804" watchObservedRunningTime="2025-12-01 09:59:53.807166198 +0000 UTC m=+2711.075814976" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.162241 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6"] Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.164270 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.167236 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.168236 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.173965 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6"] Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.181133 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/304258b8-4b3c-4ec7-833e-ecd8c317f947-secret-volume\") pod \"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.181240 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/304258b8-4b3c-4ec7-833e-ecd8c317f947-config-volume\") pod \"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.181320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckh7f\" (UniqueName: \"kubernetes.io/projected/304258b8-4b3c-4ec7-833e-ecd8c317f947-kube-api-access-ckh7f\") pod \"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.283713 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/304258b8-4b3c-4ec7-833e-ecd8c317f947-config-volume\") pod \"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.283822 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckh7f\" (UniqueName: \"kubernetes.io/projected/304258b8-4b3c-4ec7-833e-ecd8c317f947-kube-api-access-ckh7f\") pod \"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.283974 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/304258b8-4b3c-4ec7-833e-ecd8c317f947-secret-volume\") pod \"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.285006 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/304258b8-4b3c-4ec7-833e-ecd8c317f947-config-volume\") pod 
\"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.293408 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/304258b8-4b3c-4ec7-833e-ecd8c317f947-secret-volume\") pod \"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.299824 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckh7f\" (UniqueName: \"kubernetes.io/projected/304258b8-4b3c-4ec7-833e-ecd8c317f947-kube-api-access-ckh7f\") pod \"collect-profiles-29409720-qgdm6\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:00 crc kubenswrapper[4763]: I1201 10:00:00.485331 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:01 crc kubenswrapper[4763]: I1201 10:00:01.023943 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6"] Dec 01 10:00:01 crc kubenswrapper[4763]: I1201 10:00:01.820172 4763 generic.go:334] "Generic (PLEG): container finished" podID="304258b8-4b3c-4ec7-833e-ecd8c317f947" containerID="a247fcd89d16c5676fc6f851d345fc41d3cb0f5447d4d6980636fae1d7ee910e" exitCode=0 Dec 01 10:00:01 crc kubenswrapper[4763]: I1201 10:00:01.820279 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" event={"ID":"304258b8-4b3c-4ec7-833e-ecd8c317f947","Type":"ContainerDied","Data":"a247fcd89d16c5676fc6f851d345fc41d3cb0f5447d4d6980636fae1d7ee910e"} Dec 01 10:00:01 crc kubenswrapper[4763]: I1201 10:00:01.820465 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" event={"ID":"304258b8-4b3c-4ec7-833e-ecd8c317f947","Type":"ContainerStarted","Data":"6aceca356f1d28ecd96b6ea4a4af121c9d471ed9fea86b92f18d08da48809b00"} Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.290580 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.450002 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckh7f\" (UniqueName: \"kubernetes.io/projected/304258b8-4b3c-4ec7-833e-ecd8c317f947-kube-api-access-ckh7f\") pod \"304258b8-4b3c-4ec7-833e-ecd8c317f947\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.450185 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/304258b8-4b3c-4ec7-833e-ecd8c317f947-secret-volume\") pod \"304258b8-4b3c-4ec7-833e-ecd8c317f947\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.450300 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/304258b8-4b3c-4ec7-833e-ecd8c317f947-config-volume\") pod \"304258b8-4b3c-4ec7-833e-ecd8c317f947\" (UID: \"304258b8-4b3c-4ec7-833e-ecd8c317f947\") " Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.451208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304258b8-4b3c-4ec7-833e-ecd8c317f947-config-volume" (OuterVolumeSpecName: "config-volume") pod "304258b8-4b3c-4ec7-833e-ecd8c317f947" (UID: "304258b8-4b3c-4ec7-833e-ecd8c317f947"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.468712 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304258b8-4b3c-4ec7-833e-ecd8c317f947-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "304258b8-4b3c-4ec7-833e-ecd8c317f947" (UID: "304258b8-4b3c-4ec7-833e-ecd8c317f947"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.468756 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304258b8-4b3c-4ec7-833e-ecd8c317f947-kube-api-access-ckh7f" (OuterVolumeSpecName: "kube-api-access-ckh7f") pod "304258b8-4b3c-4ec7-833e-ecd8c317f947" (UID: "304258b8-4b3c-4ec7-833e-ecd8c317f947"). InnerVolumeSpecName "kube-api-access-ckh7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.551682 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/304258b8-4b3c-4ec7-833e-ecd8c317f947-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.551719 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/304258b8-4b3c-4ec7-833e-ecd8c317f947-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.551734 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckh7f\" (UniqueName: \"kubernetes.io/projected/304258b8-4b3c-4ec7-833e-ecd8c317f947-kube-api-access-ckh7f\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.838017 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" event={"ID":"304258b8-4b3c-4ec7-833e-ecd8c317f947","Type":"ContainerDied","Data":"6aceca356f1d28ecd96b6ea4a4af121c9d471ed9fea86b92f18d08da48809b00"} Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.838054 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aceca356f1d28ecd96b6ea4a4af121c9d471ed9fea86b92f18d08da48809b00" Dec 01 10:00:03 crc kubenswrapper[4763]: I1201 10:00:03.838067 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6" Dec 01 10:00:04 crc kubenswrapper[4763]: I1201 10:00:04.429838 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv"] Dec 01 10:00:04 crc kubenswrapper[4763]: I1201 10:00:04.438551 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409675-t54dv"] Dec 01 10:00:05 crc kubenswrapper[4763]: I1201 10:00:05.006260 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ce1090-925c-45cc-b797-a08ddbe3dd98" path="/var/lib/kubelet/pods/34ce1090-925c-45cc-b797-a08ddbe3dd98/volumes" Dec 01 10:00:05 crc kubenswrapper[4763]: I1201 10:00:05.278598 4763 scope.go:117] "RemoveContainer" containerID="386b20a38dd799fc39d9e66afa072e549e8356a96ece9fe4dace2888470eb5d6" Dec 01 10:00:22 crc kubenswrapper[4763]: I1201 10:00:22.858217 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jv66w"] Dec 01 10:00:22 crc kubenswrapper[4763]: E1201 10:00:22.859150 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304258b8-4b3c-4ec7-833e-ecd8c317f947" containerName="collect-profiles" Dec 01 10:00:22 crc kubenswrapper[4763]: I1201 10:00:22.859163 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="304258b8-4b3c-4ec7-833e-ecd8c317f947" containerName="collect-profiles" Dec 01 10:00:22 crc kubenswrapper[4763]: I1201 10:00:22.859348 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="304258b8-4b3c-4ec7-833e-ecd8c317f947" containerName="collect-profiles" Dec 01 10:00:22 crc kubenswrapper[4763]: I1201 10:00:22.861049 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:22 crc kubenswrapper[4763]: I1201 10:00:22.877398 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv66w"] Dec 01 10:00:22 crc kubenswrapper[4763]: I1201 10:00:22.925814 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-utilities\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:22 crc kubenswrapper[4763]: I1201 10:00:22.925863 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-catalog-content\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:22 crc kubenswrapper[4763]: I1201 10:00:22.925925 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qps\" (UniqueName: \"kubernetes.io/projected/86f13f84-a659-4136-a8f8-8278c1c818e5-kube-api-access-w5qps\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:23 crc kubenswrapper[4763]: I1201 10:00:23.027603 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5qps\" (UniqueName: \"kubernetes.io/projected/86f13f84-a659-4136-a8f8-8278c1c818e5-kube-api-access-w5qps\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:23 crc kubenswrapper[4763]: I1201 10:00:23.027792 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-utilities\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:23 crc kubenswrapper[4763]: I1201 10:00:23.027818 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-catalog-content\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:23 crc kubenswrapper[4763]: I1201 10:00:23.028953 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-catalog-content\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:23 crc kubenswrapper[4763]: I1201 10:00:23.029048 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-utilities\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:23 crc kubenswrapper[4763]: I1201 10:00:23.060927 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-w5qps\" (UniqueName: \"kubernetes.io/projected/86f13f84-a659-4136-a8f8-8278c1c818e5-kube-api-access-w5qps\") pod \"redhat-marketplace-jv66w\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:23 crc kubenswrapper[4763]: I1201 10:00:23.181171 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:23 crc kubenswrapper[4763]: I1201 10:00:23.674762 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv66w"] Dec 01 10:00:24 crc kubenswrapper[4763]: I1201 10:00:24.020439 4763 generic.go:334] "Generic (PLEG): container finished" podID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerID="8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668" exitCode=0 Dec 01 10:00:24 crc kubenswrapper[4763]: I1201 10:00:24.020532 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv66w" event={"ID":"86f13f84-a659-4136-a8f8-8278c1c818e5","Type":"ContainerDied","Data":"8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668"} Dec 01 10:00:24 crc kubenswrapper[4763]: I1201 10:00:24.020762 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv66w" event={"ID":"86f13f84-a659-4136-a8f8-8278c1c818e5","Type":"ContainerStarted","Data":"9201bfafd590fae1de378f8920d647a78137d55c4fb109f833ae1bb9778f9126"} Dec 01 10:00:24 crc kubenswrapper[4763]: I1201 10:00:24.022283 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:00:25 crc kubenswrapper[4763]: I1201 10:00:25.032989 4763 generic.go:334] "Generic (PLEG): container finished" podID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerID="af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99" exitCode=0 Dec 01 10:00:25 crc kubenswrapper[4763]: I1201 10:00:25.033075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv66w" event={"ID":"86f13f84-a659-4136-a8f8-8278c1c818e5","Type":"ContainerDied","Data":"af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99"} Dec 01 10:00:27 crc kubenswrapper[4763]: I1201 10:00:27.055186 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv66w" event={"ID":"86f13f84-a659-4136-a8f8-8278c1c818e5","Type":"ContainerStarted","Data":"751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a"} Dec 01 10:00:27 crc kubenswrapper[4763]: I1201 10:00:27.077653 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jv66w" podStartSLOduration=3.010813792 podStartE2EDuration="5.077631808s" podCreationTimestamp="2025-12-01 10:00:22 +0000 UTC" firstStartedPulling="2025-12-01 10:00:24.022056493 +0000 UTC m=+2741.290705251" lastFinishedPulling="2025-12-01 10:00:26.088874499 +0000 UTC m=+2743.357523267" observedRunningTime="2025-12-01 10:00:27.076500757 +0000 UTC m=+2744.345149525" watchObservedRunningTime="2025-12-01 10:00:27.077631808 +0000 UTC m=+2744.346280576" Dec 01 10:00:33 crc kubenswrapper[4763]: I1201 10:00:33.183096 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:33 crc kubenswrapper[4763]: I1201 10:00:33.183839 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:33 crc kubenswrapper[4763]: I1201 10:00:33.232009 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:34 crc kubenswrapper[4763]: I1201 10:00:34.148201 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:34 crc kubenswrapper[4763]: I1201 10:00:34.199506 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv66w"] Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.118514 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jv66w" podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerName="registry-server" containerID="cri-o://751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a" gracePeriod=2 Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.589217 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.707304 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-catalog-content\") pod \"86f13f84-a659-4136-a8f8-8278c1c818e5\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.707377 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-utilities\") pod \"86f13f84-a659-4136-a8f8-8278c1c818e5\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.707504 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5qps\" (UniqueName: \"kubernetes.io/projected/86f13f84-a659-4136-a8f8-8278c1c818e5-kube-api-access-w5qps\") pod \"86f13f84-a659-4136-a8f8-8278c1c818e5\" (UID: \"86f13f84-a659-4136-a8f8-8278c1c818e5\") " Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.708276 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-utilities" (OuterVolumeSpecName: "utilities") pod "86f13f84-a659-4136-a8f8-8278c1c818e5" (UID: "86f13f84-a659-4136-a8f8-8278c1c818e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.713671 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f13f84-a659-4136-a8f8-8278c1c818e5-kube-api-access-w5qps" (OuterVolumeSpecName: "kube-api-access-w5qps") pod "86f13f84-a659-4136-a8f8-8278c1c818e5" (UID: "86f13f84-a659-4136-a8f8-8278c1c818e5"). InnerVolumeSpecName "kube-api-access-w5qps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.728102 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86f13f84-a659-4136-a8f8-8278c1c818e5" (UID: "86f13f84-a659-4136-a8f8-8278c1c818e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.809083 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5qps\" (UniqueName: \"kubernetes.io/projected/86f13f84-a659-4136-a8f8-8278c1c818e5-kube-api-access-w5qps\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.809139 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:36 crc kubenswrapper[4763]: I1201 10:00:36.809151 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f13f84-a659-4136-a8f8-8278c1c818e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.129310 4763 generic.go:334] "Generic (PLEG): container finished" podID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerID="751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a" exitCode=0 Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.129388 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv66w" event={"ID":"86f13f84-a659-4136-a8f8-8278c1c818e5","Type":"ContainerDied","Data":"751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a"} Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.129422 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jv66w" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.129448 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jv66w" event={"ID":"86f13f84-a659-4136-a8f8-8278c1c818e5","Type":"ContainerDied","Data":"9201bfafd590fae1de378f8920d647a78137d55c4fb109f833ae1bb9778f9126"} Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.129558 4763 scope.go:117] "RemoveContainer" containerID="751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.151797 4763 scope.go:117] "RemoveContainer" containerID="af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.155706 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv66w"] Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.164742 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jv66w"] Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.183122 4763 scope.go:117] "RemoveContainer" containerID="8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.210173 4763 scope.go:117] "RemoveContainer" containerID="751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a" Dec 01 10:00:37 crc kubenswrapper[4763]: E1201 10:00:37.210624 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a\": container with ID starting with 751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a not found: ID does not exist" containerID="751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.210651 4763 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a"} err="failed to get container status \"751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a\": rpc error: code = NotFound desc = could not find container \"751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a\": container with ID starting with 751b216ec8e08f010a0c278e1068173e351889f04d5293e9071267dddaf45d6a not found: ID does not exist" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.210674 4763 scope.go:117] "RemoveContainer" containerID="af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99" Dec 01 10:00:37 crc kubenswrapper[4763]: E1201 10:00:37.211034 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99\": container with ID starting with af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99 not found: ID does not exist" containerID="af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.211054 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99"} err="failed to get container status \"af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99\": rpc error: code = NotFound desc = could not find container \"af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99\": container with ID starting with af5dc3738a67be313affc7a84775daaccd7ed37d19e2945ac4ae1217aabb6b99 not found: ID does not exist" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.211065 4763 scope.go:117] "RemoveContainer" containerID="8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668" Dec 01 10:00:37 crc kubenswrapper[4763]: E1201 10:00:37.211265 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668\": container with ID starting with 8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668 not found: ID does not exist" containerID="8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668" Dec 01 10:00:37 crc kubenswrapper[4763]: I1201 10:00:37.211283 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668"} err="failed to get container status \"8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668\": rpc error: code = NotFound desc = could not find container \"8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668\": container with ID starting with 8be24d9f6493238c8f22771d9e18fb842b7e0e842d8a8a4f6d8801d87e076668 not found: ID does not exist" Dec 01 10:00:39 crc kubenswrapper[4763]: I1201 10:00:39.004963 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" path="/var/lib/kubelet/pods/86f13f84-a659-4136-a8f8-8278c1c818e5/volumes" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.174979 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29409721-pwb6r"] Dec 01 10:01:00 crc kubenswrapper[4763]: E1201 10:01:00.175951 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerName="registry-server" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.175967 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerName="registry-server" Dec 01 10:01:00 crc kubenswrapper[4763]: E1201 10:01:00.176001 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerName="extract-content" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.176007 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerName="extract-content" Dec 01 10:01:00 crc kubenswrapper[4763]: E1201 10:01:00.176026 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerName="extract-utilities" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.176033 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerName="extract-utilities" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.176186 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f13f84-a659-4136-a8f8-8278c1c818e5" containerName="registry-server" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.177162 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.190509 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409721-pwb6r"] Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.274513 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s68v\" (UniqueName: \"kubernetes.io/projected/80866302-0e51-4cd8-8fb1-5106a5764fb8-kube-api-access-7s68v\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.274658 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-fernet-keys\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.274689 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-config-data\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.274712 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-combined-ca-bundle\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.375999 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-fernet-keys\") pod \"keystone-cron-29409721-pwb6r\" (UID: 
\"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.376059 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-config-data\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.376086 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-combined-ca-bundle\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.376217 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s68v\" (UniqueName: \"kubernetes.io/projected/80866302-0e51-4cd8-8fb1-5106a5764fb8-kube-api-access-7s68v\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.383447 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-fernet-keys\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.384179 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-config-data\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.386201 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-combined-ca-bundle\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.397269 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s68v\" (UniqueName: \"kubernetes.io/projected/80866302-0e51-4cd8-8fb1-5106a5764fb8-kube-api-access-7s68v\") pod \"keystone-cron-29409721-pwb6r\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.503098 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:00 crc kubenswrapper[4763]: I1201 10:01:00.971066 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29409721-pwb6r"] Dec 01 10:01:01 crc kubenswrapper[4763]: I1201 10:01:01.373036 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-pwb6r" event={"ID":"80866302-0e51-4cd8-8fb1-5106a5764fb8","Type":"ContainerStarted","Data":"a7880943252b8a1cd40f41788807f6cfa1a0efce071d41b0f5b60ce6118fb845"} Dec 01 10:01:01 crc kubenswrapper[4763]: I1201 10:01:01.373305 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-pwb6r" event={"ID":"80866302-0e51-4cd8-8fb1-5106a5764fb8","Type":"ContainerStarted","Data":"0f0720af46088b239509bd14d1bb167207df8b7d656a83305d8efc3214ab5fa0"} Dec 01 10:01:04 crc kubenswrapper[4763]: I1201 10:01:04.398925 4763 generic.go:334] "Generic (PLEG): container finished" podID="a6949d90-ef2d-4555-87b8-0929fd2048b4" containerID="d9c8e4cccfc506ada9897db61865aa280d7285c46a0e1df66aca07d8e2f03d54" exitCode=0 Dec 01 10:01:04 crc kubenswrapper[4763]: I1201 10:01:04.398981 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" event={"ID":"a6949d90-ef2d-4555-87b8-0929fd2048b4","Type":"ContainerDied","Data":"d9c8e4cccfc506ada9897db61865aa280d7285c46a0e1df66aca07d8e2f03d54"} Dec 01 10:01:04 crc kubenswrapper[4763]: I1201 10:01:04.401969 4763 generic.go:334] "Generic (PLEG): container finished" podID="80866302-0e51-4cd8-8fb1-5106a5764fb8" containerID="a7880943252b8a1cd40f41788807f6cfa1a0efce071d41b0f5b60ce6118fb845" exitCode=0 Dec 01 10:01:04 crc kubenswrapper[4763]: I1201 10:01:04.402005 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-pwb6r" event={"ID":"80866302-0e51-4cd8-8fb1-5106a5764fb8","Type":"ContainerDied","Data":"a7880943252b8a1cd40f41788807f6cfa1a0efce071d41b0f5b60ce6118fb845"} Dec 01 10:01:04 crc kubenswrapper[4763]: I1201 10:01:04.445251 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29409721-pwb6r" podStartSLOduration=4.445230609 podStartE2EDuration="4.445230609s" podCreationTimestamp="2025-12-01 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:01:01.423233386 +0000 UTC m=+2778.691882164" watchObservedRunningTime="2025-12-01 10:01:04.445230609 +0000 UTC m=+2781.713879367" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.790765 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.878497 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-fernet-keys\") pod \"80866302-0e51-4cd8-8fb1-5106a5764fb8\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.878632 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-combined-ca-bundle\") pod \"80866302-0e51-4cd8-8fb1-5106a5764fb8\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.878660 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-config-data\") pod \"80866302-0e51-4cd8-8fb1-5106a5764fb8\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.878733 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s68v\" (UniqueName: \"kubernetes.io/projected/80866302-0e51-4cd8-8fb1-5106a5764fb8-kube-api-access-7s68v\") pod \"80866302-0e51-4cd8-8fb1-5106a5764fb8\" (UID: \"80866302-0e51-4cd8-8fb1-5106a5764fb8\") " Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.884631 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80866302-0e51-4cd8-8fb1-5106a5764fb8-kube-api-access-7s68v" (OuterVolumeSpecName: "kube-api-access-7s68v") pod "80866302-0e51-4cd8-8fb1-5106a5764fb8" (UID: "80866302-0e51-4cd8-8fb1-5106a5764fb8"). InnerVolumeSpecName "kube-api-access-7s68v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.890048 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "80866302-0e51-4cd8-8fb1-5106a5764fb8" (UID: "80866302-0e51-4cd8-8fb1-5106a5764fb8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.913010 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80866302-0e51-4cd8-8fb1-5106a5764fb8" (UID: "80866302-0e51-4cd8-8fb1-5106a5764fb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.942618 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.946849 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-config-data" (OuterVolumeSpecName: "config-data") pod "80866302-0e51-4cd8-8fb1-5106a5764fb8" (UID: "80866302-0e51-4cd8-8fb1-5106a5764fb8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.982198 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.982239 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.983317 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s68v\" (UniqueName: \"kubernetes.io/projected/80866302-0e51-4cd8-8fb1-5106a5764fb8-kube-api-access-7s68v\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:05 crc kubenswrapper[4763]: I1201 10:01:05.983342 4763 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80866302-0e51-4cd8-8fb1-5106a5764fb8-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.085288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ssh-key\") pod \"a6949d90-ef2d-4555-87b8-0929fd2048b4\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.085436 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-metadata-combined-ca-bundle\") pod \"a6949d90-ef2d-4555-87b8-0929fd2048b4\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.085664 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcdgt\" (UniqueName: \"kubernetes.io/projected/a6949d90-ef2d-4555-87b8-0929fd2048b4-kube-api-access-mcdgt\") pod \"a6949d90-ef2d-4555-87b8-0929fd2048b4\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.085701 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ceph\") pod \"a6949d90-ef2d-4555-87b8-0929fd2048b4\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.085824 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-nova-metadata-neutron-config-0\") pod \"a6949d90-ef2d-4555-87b8-0929fd2048b4\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.085861 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-inventory\") pod \"a6949d90-ef2d-4555-87b8-0929fd2048b4\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.085954 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a6949d90-ef2d-4555-87b8-0929fd2048b4\" (UID: \"a6949d90-ef2d-4555-87b8-0929fd2048b4\") " Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.089117 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6949d90-ef2d-4555-87b8-0929fd2048b4-kube-api-access-mcdgt" (OuterVolumeSpecName: "kube-api-access-mcdgt") pod "a6949d90-ef2d-4555-87b8-0929fd2048b4" (UID: "a6949d90-ef2d-4555-87b8-0929fd2048b4"). InnerVolumeSpecName "kube-api-access-mcdgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.089753 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a6949d90-ef2d-4555-87b8-0929fd2048b4" (UID: "a6949d90-ef2d-4555-87b8-0929fd2048b4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.092092 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ceph" (OuterVolumeSpecName: "ceph") pod "a6949d90-ef2d-4555-87b8-0929fd2048b4" (UID: "a6949d90-ef2d-4555-87b8-0929fd2048b4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.112931 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-inventory" (OuterVolumeSpecName: "inventory") pod "a6949d90-ef2d-4555-87b8-0929fd2048b4" (UID: "a6949d90-ef2d-4555-87b8-0929fd2048b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.114430 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a6949d90-ef2d-4555-87b8-0929fd2048b4" (UID: "a6949d90-ef2d-4555-87b8-0929fd2048b4"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.124456 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a6949d90-ef2d-4555-87b8-0929fd2048b4" (UID: "a6949d90-ef2d-4555-87b8-0929fd2048b4"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.124612 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a6949d90-ef2d-4555-87b8-0929fd2048b4" (UID: "a6949d90-ef2d-4555-87b8-0929fd2048b4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.188911 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcdgt\" (UniqueName: \"kubernetes.io/projected/a6949d90-ef2d-4555-87b8-0929fd2048b4-kube-api-access-mcdgt\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.188940 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.188949 4763 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.188960 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.188970 4763 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.188981 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.188989 4763 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6949d90-ef2d-4555-87b8-0929fd2048b4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.423409 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" event={"ID":"a6949d90-ef2d-4555-87b8-0929fd2048b4","Type":"ContainerDied","Data":"e75c22c38da16834a78eb109788bd93598752932eb701299e863daed510ee555"} Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.423438 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.423448 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e75c22c38da16834a78eb109788bd93598752932eb701299e863daed510ee555" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.426057 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29409721-pwb6r" event={"ID":"80866302-0e51-4cd8-8fb1-5106a5764fb8","Type":"ContainerDied","Data":"0f0720af46088b239509bd14d1bb167207df8b7d656a83305d8efc3214ab5fa0"} Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.426103 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f0720af46088b239509bd14d1bb167207df8b7d656a83305d8efc3214ab5fa0" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.426181 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29409721-pwb6r" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.527736 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl"] Dec 01 10:01:06 crc kubenswrapper[4763]: E1201 10:01:06.528144 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6949d90-ef2d-4555-87b8-0929fd2048b4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.528171 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6949d90-ef2d-4555-87b8-0929fd2048b4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 10:01:06 crc kubenswrapper[4763]: E1201 10:01:06.528209 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80866302-0e51-4cd8-8fb1-5106a5764fb8" containerName="keystone-cron" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.528218 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="80866302-0e51-4cd8-8fb1-5106a5764fb8" containerName="keystone-cron" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.528420 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6949d90-ef2d-4555-87b8-0929fd2048b4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.528478 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="80866302-0e51-4cd8-8fb1-5106a5764fb8" containerName="keystone-cron" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.529213 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.531626 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.532082 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.532204 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.532243 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.532352 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.532448 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.546975 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl"] Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.596607 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkrp\" (UniqueName: \"kubernetes.io/projected/a458d267-3663-4b9e-baa3-c3711a334c80-kube-api-access-wwkrp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.596655 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.596684 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.596734 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.596797 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.596836 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.698563 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkrp\" (UniqueName: \"kubernetes.io/projected/a458d267-3663-4b9e-baa3-c3711a334c80-kube-api-access-wwkrp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.698892 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.698933 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.698984 4763 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.699054 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.699101 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.703125 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.704566 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.707031 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.707721 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.711046 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.716127 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkrp\" (UniqueName: \"kubernetes.io/projected/a458d267-3663-4b9e-baa3-c3711a334c80-kube-api-access-wwkrp\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:06 crc kubenswrapper[4763]: I1201 10:01:06.862623 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:01:07 crc kubenswrapper[4763]: I1201 10:01:07.394352 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl"] Dec 01 10:01:07 crc kubenswrapper[4763]: I1201 10:01:07.438743 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" event={"ID":"a458d267-3663-4b9e-baa3-c3711a334c80","Type":"ContainerStarted","Data":"717d1d8b1849fe77a83c5cc2ca85675b718496f98dda7523a74204fa1800de43"} Dec 01 10:01:08 crc kubenswrapper[4763]: I1201 10:01:08.447572 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" event={"ID":"a458d267-3663-4b9e-baa3-c3711a334c80","Type":"ContainerStarted","Data":"7bb8111fa53c249d2b9c3bcac349133c67f7358ff107c632748e0838c52032f4"} Dec 01 10:01:08 crc kubenswrapper[4763]: I1201 10:01:08.474982 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" podStartSLOduration=1.815640747 podStartE2EDuration="2.474966218s" podCreationTimestamp="2025-12-01 10:01:06 +0000 UTC" firstStartedPulling="2025-12-01 10:01:07.39292903 +0000 UTC m=+2784.661577808" lastFinishedPulling="2025-12-01 10:01:08.052254491 +0000 UTC m=+2785.320903279" observedRunningTime="2025-12-01 10:01:08.473970389 +0000 UTC m=+2785.742619157" watchObservedRunningTime="2025-12-01 10:01:08.474966218 +0000 UTC m=+2785.743614986" Dec 01 10:02:03 crc kubenswrapper[4763]: I1201 10:02:03.929894 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:02:03 crc kubenswrapper[4763]: I1201 10:02:03.930331 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:02:33 crc kubenswrapper[4763]: I1201 10:02:33.929633 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:02:33 crc kubenswrapper[4763]: I1201 10:02:33.930349 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:03:03 crc kubenswrapper[4763]: I1201 10:03:03.929521 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:03:03 crc kubenswrapper[4763]: I1201 10:03:03.930154 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:03:03 crc kubenswrapper[4763]: I1201 10:03:03.930204 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 10:03:03 crc kubenswrapper[4763]: I1201 10:03:03.930964 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f702c11f8b0131f090364f487a6cf11628da1960227361edee9a7525bb67ff62"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:03:03 crc kubenswrapper[4763]: I1201 10:03:03.931013 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://f702c11f8b0131f090364f487a6cf11628da1960227361edee9a7525bb67ff62" gracePeriod=600 Dec 01 10:03:04 crc kubenswrapper[4763]: I1201 10:03:04.528585 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="f702c11f8b0131f090364f487a6cf11628da1960227361edee9a7525bb67ff62" exitCode=0 Dec 01 10:03:04 crc kubenswrapper[4763]: I1201 10:03:04.528737 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"f702c11f8b0131f090364f487a6cf11628da1960227361edee9a7525bb67ff62"} Dec 01 10:03:04 crc kubenswrapper[4763]: I1201 10:03:04.529119 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb"} Dec 01 10:03:04 crc kubenswrapper[4763]: I1201 10:03:04.529141 4763 scope.go:117] "RemoveContainer" containerID="09bda49e99e6c58ea61603b14a079ada8d4631a2520c1c799cdbc96d2a04fba7" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.036615 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xgf9g"] Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.039656 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.060686 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgf9g"] Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.172292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-catalog-content\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.172437 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-utilities\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.172501 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6s9c\" (UniqueName: \"kubernetes.io/projected/896ff67c-defb-4799-8a02-b9b1d749fc1c-kube-api-access-c6s9c\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.274318 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-catalog-content\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.274386 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-utilities\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.274412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6s9c\" (UniqueName: \"kubernetes.io/projected/896ff67c-defb-4799-8a02-b9b1d749fc1c-kube-api-access-c6s9c\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.275245 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-catalog-content\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.275506 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-utilities\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.294750 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c6s9c\" (UniqueName: \"kubernetes.io/projected/896ff67c-defb-4799-8a02-b9b1d749fc1c-kube-api-access-c6s9c\") pod \"certified-operators-xgf9g\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.370789 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:19 crc kubenswrapper[4763]: I1201 10:03:19.974774 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xgf9g"] Dec 01 10:03:20 crc kubenswrapper[4763]: I1201 10:03:20.681218 4763 generic.go:334] "Generic (PLEG): container finished" podID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerID="957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce" exitCode=0 Dec 01 10:03:20 crc kubenswrapper[4763]: I1201 10:03:20.681273 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgf9g" event={"ID":"896ff67c-defb-4799-8a02-b9b1d749fc1c","Type":"ContainerDied","Data":"957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce"} Dec 01 10:03:20 crc kubenswrapper[4763]: I1201 10:03:20.681570 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgf9g" event={"ID":"896ff67c-defb-4799-8a02-b9b1d749fc1c","Type":"ContainerStarted","Data":"663050e2c910d0c55757c63a08c95aa4b87e2c139eaf681cb47f128d96dfa923"} Dec 01 10:03:22 crc kubenswrapper[4763]: I1201 10:03:22.712676 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgf9g" event={"ID":"896ff67c-defb-4799-8a02-b9b1d749fc1c","Type":"ContainerStarted","Data":"1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1"} Dec 01 10:03:23 crc kubenswrapper[4763]: I1201 10:03:23.722237 4763 generic.go:334] "Generic (PLEG): container finished" podID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerID="1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1" exitCode=0 Dec 01 10:03:23 crc kubenswrapper[4763]: I1201 10:03:23.722283 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgf9g" event={"ID":"896ff67c-defb-4799-8a02-b9b1d749fc1c","Type":"ContainerDied","Data":"1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1"} Dec 01 10:03:24 crc kubenswrapper[4763]: I1201 10:03:24.732237 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgf9g" event={"ID":"896ff67c-defb-4799-8a02-b9b1d749fc1c","Type":"ContainerStarted","Data":"b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa"} Dec 01 10:03:24 crc kubenswrapper[4763]: I1201 10:03:24.758852 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xgf9g" podStartSLOduration=2.216319466 podStartE2EDuration="5.758832833s" podCreationTimestamp="2025-12-01 10:03:19 +0000 UTC" firstStartedPulling="2025-12-01 10:03:20.683651015 +0000 UTC m=+2917.952299783" lastFinishedPulling="2025-12-01 10:03:24.226164382 +0000 UTC m=+2921.494813150" observedRunningTime="2025-12-01 10:03:24.754525417 +0000 UTC m=+2922.023174185" watchObservedRunningTime="2025-12-01 10:03:24.758832833 +0000 UTC m=+2922.027481601" Dec 01 10:03:29 crc kubenswrapper[4763]: I1201 10:03:29.371175 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:29 crc kubenswrapper[4763]: I1201 10:03:29.371844 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:29 crc kubenswrapper[4763]: I1201 10:03:29.416958 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:29 crc kubenswrapper[4763]: I1201 10:03:29.813795 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:29 crc kubenswrapper[4763]: I1201 10:03:29.872030 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgf9g"] Dec 01 10:03:31 crc kubenswrapper[4763]: I1201 10:03:31.801909 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xgf9g" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerName="registry-server" containerID="cri-o://b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa" gracePeriod=2 Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.276971 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.386315 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-utilities\") pod \"896ff67c-defb-4799-8a02-b9b1d749fc1c\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.386426 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6s9c\" (UniqueName: \"kubernetes.io/projected/896ff67c-defb-4799-8a02-b9b1d749fc1c-kube-api-access-c6s9c\") pod \"896ff67c-defb-4799-8a02-b9b1d749fc1c\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.386607 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-catalog-content\") pod \"896ff67c-defb-4799-8a02-b9b1d749fc1c\" (UID: \"896ff67c-defb-4799-8a02-b9b1d749fc1c\") " Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.387499 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-utilities" (OuterVolumeSpecName: "utilities") pod "896ff67c-defb-4799-8a02-b9b1d749fc1c" (UID: "896ff67c-defb-4799-8a02-b9b1d749fc1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.394448 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896ff67c-defb-4799-8a02-b9b1d749fc1c-kube-api-access-c6s9c" (OuterVolumeSpecName: "kube-api-access-c6s9c") pod "896ff67c-defb-4799-8a02-b9b1d749fc1c" (UID: "896ff67c-defb-4799-8a02-b9b1d749fc1c"). InnerVolumeSpecName "kube-api-access-c6s9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.439157 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "896ff67c-defb-4799-8a02-b9b1d749fc1c" (UID: "896ff67c-defb-4799-8a02-b9b1d749fc1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.489344 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.489600 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/896ff67c-defb-4799-8a02-b9b1d749fc1c-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.490042 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6s9c\" (UniqueName: \"kubernetes.io/projected/896ff67c-defb-4799-8a02-b9b1d749fc1c-kube-api-access-c6s9c\") on node \"crc\" DevicePath \"\"" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.813970 4763 generic.go:334] "Generic (PLEG): container finished" podID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerID="b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa" exitCode=0 Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.814027 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgf9g" event={"ID":"896ff67c-defb-4799-8a02-b9b1d749fc1c","Type":"ContainerDied","Data":"b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa"} Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.814056 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xgf9g" event={"ID":"896ff67c-defb-4799-8a02-b9b1d749fc1c","Type":"ContainerDied","Data":"663050e2c910d0c55757c63a08c95aa4b87e2c139eaf681cb47f128d96dfa923"} Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.814073 4763 scope.go:117] "RemoveContainer" containerID="b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.814252 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xgf9g" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.849608 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xgf9g"] Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.852839 4763 scope.go:117] "RemoveContainer" containerID="1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.856731 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xgf9g"] Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.878378 4763 scope.go:117] "RemoveContainer" containerID="957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.930736 4763 scope.go:117] "RemoveContainer" containerID="b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa" Dec 01 10:03:32 crc kubenswrapper[4763]: E1201 10:03:32.931418 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa\": container with ID starting with b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa not found: ID does not exist" containerID="b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.931486 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa"} err="failed to get container status \"b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa\": rpc error: code = NotFound desc = could not find container \"b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa\": container with ID starting with b7581649b33dbf6ccad9a4a93127459f2b2314dac45d2fca613533c0502b2eaa not found: ID does not exist" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.931555 4763 scope.go:117] "RemoveContainer" containerID="1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1" Dec 01 10:03:32 crc kubenswrapper[4763]: E1201 10:03:32.932095 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1\": container with ID starting with 1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1 not found: ID does not exist" containerID="1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.932207 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1"} err="failed to get container status \"1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1\": rpc error: code = NotFound desc = could not find container \"1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1\": container with ID starting with 1bce1b21996406711e68c99644bebf7ca8a0a75266b5d22c50451a2c0f28d9b1 not found: ID does not exist" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.932249 4763 scope.go:117] "RemoveContainer" containerID="957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce" Dec 01 10:03:32 crc kubenswrapper[4763]: E1201 10:03:32.932763 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce\": container with ID starting with 957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce not found: ID does not exist" containerID="957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce" Dec 01 10:03:32 crc kubenswrapper[4763]: I1201 10:03:32.932808 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce"} err="failed to get container status \"957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce\": rpc error: code = NotFound desc = could not find container \"957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce\": container with ID starting with 957875c7493d0c6587e080c19940bedc6a53e8630fe7404aef1128ec62f9b6ce not found: ID does not exist" Dec 01 10:03:33 crc kubenswrapper[4763]: I1201 10:03:33.006752 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" path="/var/lib/kubelet/pods/896ff67c-defb-4799-8a02-b9b1d749fc1c/volumes" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.347374 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gl6g7"] Dec 01 10:04:03 crc kubenswrapper[4763]: E1201 10:04:03.349366 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerName="registry-server" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.349470 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerName="registry-server" Dec 01 10:04:03 crc kubenswrapper[4763]: E1201 10:04:03.349551 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerName="extract-content" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.349609 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerName="extract-content" Dec 01 10:04:03 crc kubenswrapper[4763]: E1201 10:04:03.349669 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerName="extract-utilities" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.349725 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerName="extract-utilities" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.349961 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="896ff67c-defb-4799-8a02-b9b1d749fc1c" containerName="registry-server" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.351292 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.368145 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gl6g7"] Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.403063 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-catalog-content\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.403441 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wtjr\" (UniqueName: \"kubernetes.io/projected/bbc05a00-8782-46b4-a9f2-22e186226404-kube-api-access-9wtjr\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.403662 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-utilities\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.505723 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-utilities\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.505834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-catalog-content\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.506258 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wtjr\" (UniqueName: \"kubernetes.io/projected/bbc05a00-8782-46b4-a9f2-22e186226404-kube-api-access-9wtjr\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.506712 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-utilities\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.506835 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-catalog-content\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.541587 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9wtjr\" (UniqueName: \"kubernetes.io/projected/bbc05a00-8782-46b4-a9f2-22e186226404-kube-api-access-9wtjr\") pod \"community-operators-gl6g7\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:03 crc kubenswrapper[4763]: I1201 10:04:03.674661 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:04 crc kubenswrapper[4763]: I1201 10:04:04.067213 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gl6g7"] Dec 01 10:04:04 crc kubenswrapper[4763]: I1201 10:04:04.110711 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl6g7" event={"ID":"bbc05a00-8782-46b4-a9f2-22e186226404","Type":"ContainerStarted","Data":"bb3e40acfeaa711155ab23a173f587e2fccc2b55b12ebe9064d7eb6193e3b7e8"} Dec 01 10:04:05 crc kubenswrapper[4763]: I1201 10:04:05.120721 4763 generic.go:334] "Generic (PLEG): container finished" podID="bbc05a00-8782-46b4-a9f2-22e186226404" containerID="31ee1b65ea4ca0de2e1af1c903c1b7991750895e6665e5458ee82b5656b6383d" exitCode=0 Dec 01 10:04:05 crc kubenswrapper[4763]: I1201 10:04:05.120860 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl6g7" event={"ID":"bbc05a00-8782-46b4-a9f2-22e186226404","Type":"ContainerDied","Data":"31ee1b65ea4ca0de2e1af1c903c1b7991750895e6665e5458ee82b5656b6383d"} Dec 01 10:04:07 crc kubenswrapper[4763]: I1201 10:04:07.155266 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl6g7" event={"ID":"bbc05a00-8782-46b4-a9f2-22e186226404","Type":"ContainerStarted","Data":"c296f2f4d8e11072efac09c930974f52d04ca93649b3e805ed5ee7c93ed2fa70"} Dec 01 10:04:08 crc kubenswrapper[4763]: I1201 10:04:08.167450 4763 generic.go:334] "Generic (PLEG): container finished" podID="bbc05a00-8782-46b4-a9f2-22e186226404" containerID="c296f2f4d8e11072efac09c930974f52d04ca93649b3e805ed5ee7c93ed2fa70" exitCode=0 Dec 01 10:04:08 crc kubenswrapper[4763]: I1201 10:04:08.167567 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl6g7" event={"ID":"bbc05a00-8782-46b4-a9f2-22e186226404","Type":"ContainerDied","Data":"c296f2f4d8e11072efac09c930974f52d04ca93649b3e805ed5ee7c93ed2fa70"} Dec 01 10:04:09 crc kubenswrapper[4763]: I1201 10:04:09.184847 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl6g7" event={"ID":"bbc05a00-8782-46b4-a9f2-22e186226404","Type":"ContainerStarted","Data":"ccbe925d806999a44149fa301ed45cd405c8ca304b3e64395d9cfcdd4d50d977"} Dec 01 10:04:09 crc kubenswrapper[4763]: I1201 10:04:09.214287 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gl6g7" podStartSLOduration=2.722038521 podStartE2EDuration="6.214269874s" podCreationTimestamp="2025-12-01 10:04:03 +0000 UTC" firstStartedPulling="2025-12-01 10:04:05.124099652 +0000 UTC m=+2962.392748420" lastFinishedPulling="2025-12-01 10:04:08.616331005 +0000 UTC m=+2965.884979773" observedRunningTime="2025-12-01 10:04:09.211931912 +0000 UTC m=+2966.480580680" watchObservedRunningTime="2025-12-01 10:04:09.214269874 +0000 UTC m=+2966.482918632" Dec 01 10:04:13 crc kubenswrapper[4763]: I1201 10:04:13.674925 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:13 crc kubenswrapper[4763]: I1201 10:04:13.675702 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:13 crc kubenswrapper[4763]: I1201 10:04:13.729296 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:14 crc kubenswrapper[4763]: I1201 10:04:14.272517 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:14 crc kubenswrapper[4763]: I1201 10:04:14.329426 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gl6g7"] Dec 01 10:04:16 crc kubenswrapper[4763]: I1201 10:04:16.252322 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gl6g7" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" containerName="registry-server" containerID="cri-o://ccbe925d806999a44149fa301ed45cd405c8ca304b3e64395d9cfcdd4d50d977" gracePeriod=2 Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.263660 4763 generic.go:334] "Generic (PLEG): container finished" podID="bbc05a00-8782-46b4-a9f2-22e186226404" containerID="ccbe925d806999a44149fa301ed45cd405c8ca304b3e64395d9cfcdd4d50d977" exitCode=0 Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.263696 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl6g7" event={"ID":"bbc05a00-8782-46b4-a9f2-22e186226404","Type":"ContainerDied","Data":"ccbe925d806999a44149fa301ed45cd405c8ca304b3e64395d9cfcdd4d50d977"} Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.427382 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.567376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-utilities\") pod \"bbc05a00-8782-46b4-a9f2-22e186226404\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.568054 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wtjr\" (UniqueName: \"kubernetes.io/projected/bbc05a00-8782-46b4-a9f2-22e186226404-kube-api-access-9wtjr\") pod \"bbc05a00-8782-46b4-a9f2-22e186226404\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.568169 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-catalog-content\") pod \"bbc05a00-8782-46b4-a9f2-22e186226404\" (UID: \"bbc05a00-8782-46b4-a9f2-22e186226404\") " Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.568447 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-utilities" (OuterVolumeSpecName: "utilities") pod "bbc05a00-8782-46b4-a9f2-22e186226404" (UID: "bbc05a00-8782-46b4-a9f2-22e186226404"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.568681 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.577375 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc05a00-8782-46b4-a9f2-22e186226404-kube-api-access-9wtjr" (OuterVolumeSpecName: "kube-api-access-9wtjr") pod "bbc05a00-8782-46b4-a9f2-22e186226404" (UID: "bbc05a00-8782-46b4-a9f2-22e186226404"). InnerVolumeSpecName "kube-api-access-9wtjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.622502 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbc05a00-8782-46b4-a9f2-22e186226404" (UID: "bbc05a00-8782-46b4-a9f2-22e186226404"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.669975 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wtjr\" (UniqueName: \"kubernetes.io/projected/bbc05a00-8782-46b4-a9f2-22e186226404-kube-api-access-9wtjr\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:17 crc kubenswrapper[4763]: I1201 10:04:17.670005 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbc05a00-8782-46b4-a9f2-22e186226404-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:04:18 crc kubenswrapper[4763]: I1201 10:04:18.273721 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl6g7" event={"ID":"bbc05a00-8782-46b4-a9f2-22e186226404","Type":"ContainerDied","Data":"bb3e40acfeaa711155ab23a173f587e2fccc2b55b12ebe9064d7eb6193e3b7e8"} Dec 01 10:04:18 crc kubenswrapper[4763]: I1201 10:04:18.274585 4763 scope.go:117] "RemoveContainer" containerID="ccbe925d806999a44149fa301ed45cd405c8ca304b3e64395d9cfcdd4d50d977" Dec 01 10:04:18 crc kubenswrapper[4763]: I1201 10:04:18.273784 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gl6g7" Dec 01 10:04:18 crc kubenswrapper[4763]: I1201 10:04:18.299348 4763 scope.go:117] "RemoveContainer" containerID="c296f2f4d8e11072efac09c930974f52d04ca93649b3e805ed5ee7c93ed2fa70" Dec 01 10:04:18 crc kubenswrapper[4763]: I1201 10:04:18.317604 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gl6g7"] Dec 01 10:04:18 crc kubenswrapper[4763]: I1201 10:04:18.338351 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gl6g7"] Dec 01 10:04:18 crc kubenswrapper[4763]: I1201 10:04:18.340189 4763 scope.go:117] "RemoveContainer" containerID="31ee1b65ea4ca0de2e1af1c903c1b7991750895e6665e5458ee82b5656b6383d" Dec 01 10:04:19 crc kubenswrapper[4763]: I1201 10:04:19.004847 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" path="/var/lib/kubelet/pods/bbc05a00-8782-46b4-a9f2-22e186226404/volumes" Dec 01 10:05:33 crc kubenswrapper[4763]: I1201 10:05:33.929048 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:05:33 crc kubenswrapper[4763]: I1201 10:05:33.929605 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:06:03 crc kubenswrapper[4763]: I1201 10:06:03.929543 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:06:03 crc kubenswrapper[4763]: I1201 10:06:03.930175 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:06:14 crc kubenswrapper[4763]: I1201 10:06:14.851843 4763 generic.go:334] "Generic (PLEG): container finished" podID="a458d267-3663-4b9e-baa3-c3711a334c80" containerID="7bb8111fa53c249d2b9c3bcac349133c67f7358ff107c632748e0838c52032f4" exitCode=0 Dec 01 10:06:14 crc kubenswrapper[4763]: I1201 10:06:14.851951 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" event={"ID":"a458d267-3663-4b9e-baa3-c3711a334c80","Type":"ContainerDied","Data":"7bb8111fa53c249d2b9c3bcac349133c67f7358ff107c632748e0838c52032f4"} Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.271690 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.473075 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwkrp\" (UniqueName: \"kubernetes.io/projected/a458d267-3663-4b9e-baa3-c3711a334c80-kube-api-access-wwkrp\") pod \"a458d267-3663-4b9e-baa3-c3711a334c80\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.473422 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-combined-ca-bundle\") pod \"a458d267-3663-4b9e-baa3-c3711a334c80\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.473617 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-secret-0\") pod \"a458d267-3663-4b9e-baa3-c3711a334c80\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.473731 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-inventory\") pod \"a458d267-3663-4b9e-baa3-c3711a334c80\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.473847 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ssh-key\") pod \"a458d267-3663-4b9e-baa3-c3711a334c80\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.474014 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ceph\") pod \"a458d267-3663-4b9e-baa3-c3711a334c80\" (UID: \"a458d267-3663-4b9e-baa3-c3711a334c80\") " Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.479306 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ceph" (OuterVolumeSpecName: "ceph") pod "a458d267-3663-4b9e-baa3-c3711a334c80" (UID: "a458d267-3663-4b9e-baa3-c3711a334c80"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.479318 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a458d267-3663-4b9e-baa3-c3711a334c80" (UID: "a458d267-3663-4b9e-baa3-c3711a334c80"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.479895 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a458d267-3663-4b9e-baa3-c3711a334c80-kube-api-access-wwkrp" (OuterVolumeSpecName: "kube-api-access-wwkrp") pod "a458d267-3663-4b9e-baa3-c3711a334c80" (UID: "a458d267-3663-4b9e-baa3-c3711a334c80"). InnerVolumeSpecName "kube-api-access-wwkrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.503433 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a458d267-3663-4b9e-baa3-c3711a334c80" (UID: "a458d267-3663-4b9e-baa3-c3711a334c80"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.503885 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a458d267-3663-4b9e-baa3-c3711a334c80" (UID: "a458d267-3663-4b9e-baa3-c3711a334c80"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.511477 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-inventory" (OuterVolumeSpecName: "inventory") pod "a458d267-3663-4b9e-baa3-c3711a334c80" (UID: "a458d267-3663-4b9e-baa3-c3711a334c80"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.576994 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.577030 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwkrp\" (UniqueName: \"kubernetes.io/projected/a458d267-3663-4b9e-baa3-c3711a334c80-kube-api-access-wwkrp\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.577045 4763 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.577058 4763 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.577071 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.577081 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a458d267-3663-4b9e-baa3-c3711a334c80-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.872429 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" event={"ID":"a458d267-3663-4b9e-baa3-c3711a334c80","Type":"ContainerDied","Data":"717d1d8b1849fe77a83c5cc2ca85675b718496f98dda7523a74204fa1800de43"} Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.872505 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="717d1d8b1849fe77a83c5cc2ca85675b718496f98dda7523a74204fa1800de43" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.872578 4763 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.992341 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk"] Dec 01 10:06:16 crc kubenswrapper[4763]: E1201 10:06:16.992762 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a458d267-3663-4b9e-baa3-c3711a334c80" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.992781 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a458d267-3663-4b9e-baa3-c3711a334c80" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 10:06:16 crc kubenswrapper[4763]: E1201 10:06:16.992800 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" containerName="registry-server" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.992807 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" containerName="registry-server" Dec 01 10:06:16 crc kubenswrapper[4763]: E1201 10:06:16.992831 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" containerName="extract-utilities" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.992839 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" containerName="extract-utilities" Dec 01 10:06:16 crc kubenswrapper[4763]: E1201 10:06:16.992856 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" containerName="extract-content" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.992863 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" containerName="extract-content" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.993046 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a458d267-3663-4b9e-baa3-c3711a334c80" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.993063 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc05a00-8782-46b4-a9f2-22e186226404" containerName="registry-server" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.993901 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:16 crc kubenswrapper[4763]: I1201 10:06:16.999100 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.000361 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.000622 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.000798 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.000958 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.001239 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.001399 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.001579 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rl28q" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.004331 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.012059 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk"] Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089280 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nnbp\" (UniqueName: \"kubernetes.io/projected/43b000e9-21e9-47f9-8bc7-a93a8747159e-kube-api-access-6nnbp\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089368 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089393 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089413 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089440 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089499 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089521 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089601 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089714 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089736 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.089754 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc 
kubenswrapper[4763]: I1201 10:06:17.191505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191568 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191596 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191648 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nnbp\" (UniqueName: \"kubernetes.io/projected/43b000e9-21e9-47f9-8bc7-a93a8747159e-kube-api-access-6nnbp\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191684 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191707 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191729 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191752 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191780 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191798 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.191832 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.193203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.193514 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.195389 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.196246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.196715 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: 
\"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.197151 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.197684 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.198180 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.198936 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.199341 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.215094 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nnbp\" (UniqueName: \"kubernetes.io/projected/43b000e9-21e9-47f9-8bc7-a93a8747159e-kube-api-access-6nnbp\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.346884 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.872302 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk"] Dec 01 10:06:17 crc kubenswrapper[4763]: I1201 10:06:17.880737 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:06:18 crc kubenswrapper[4763]: I1201 10:06:18.899740 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" event={"ID":"43b000e9-21e9-47f9-8bc7-a93a8747159e","Type":"ContainerStarted","Data":"86c8429555e3e115e85e600b04c140931daa08ce8765195c0aa6fe7a47c70f3b"} Dec 01 10:06:18 crc kubenswrapper[4763]: I1201 10:06:18.900073 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" event={"ID":"43b000e9-21e9-47f9-8bc7-a93a8747159e","Type":"ContainerStarted","Data":"78646096cc6d6530c1664abf28ec0c4a413e4c53fae535c3b503cc59b3a1e9d5"} Dec 01 10:06:18 crc kubenswrapper[4763]: I1201 10:06:18.915490 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" podStartSLOduration=2.168618856 podStartE2EDuration="2.915473876s" podCreationTimestamp="2025-12-01 10:06:16 +0000 UTC" firstStartedPulling="2025-12-01 10:06:17.880487885 +0000 UTC m=+3095.149136653" lastFinishedPulling="2025-12-01 10:06:18.627342895 +0000 UTC m=+3095.895991673" observedRunningTime="2025-12-01 10:06:18.913708239 +0000 UTC m=+3096.182357007" watchObservedRunningTime="2025-12-01 10:06:18.915473876 +0000 UTC m=+3096.184122644" Dec 01 10:06:33 crc kubenswrapper[4763]: I1201 10:06:33.929589 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:06:33 crc kubenswrapper[4763]: I1201 10:06:33.931416 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:06:33 crc kubenswrapper[4763]: I1201 10:06:33.931637 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 10:06:33 crc kubenswrapper[4763]: I1201 10:06:33.932954 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:06:33 crc kubenswrapper[4763]: I1201 10:06:33.933149 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" 
containerID="cri-o://25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" gracePeriod=600 Dec 01 10:06:34 crc kubenswrapper[4763]: E1201 10:06:34.055980 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:06:35 crc kubenswrapper[4763]: I1201 10:06:35.021376 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" exitCode=0 Dec 01 10:06:35 crc kubenswrapper[4763]: I1201 10:06:35.021434 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb"} Dec 01 10:06:35 crc kubenswrapper[4763]: I1201 10:06:35.021565 4763 scope.go:117] "RemoveContainer" containerID="f702c11f8b0131f090364f487a6cf11628da1960227361edee9a7525bb67ff62" Dec 01 10:06:35 crc kubenswrapper[4763]: I1201 10:06:35.022140 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:06:35 crc kubenswrapper[4763]: E1201 10:06:35.022492 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:06:48 crc kubenswrapper[4763]: I1201 10:06:48.994386 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:06:48 crc kubenswrapper[4763]: E1201 10:06:48.994975 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:07:01 crc kubenswrapper[4763]: I1201 10:07:01.994365 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:07:01 crc kubenswrapper[4763]: E1201 10:07:01.995233 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:07:14 crc kubenswrapper[4763]: I1201 10:07:14.994338 4763 scope.go:117] "RemoveContainer" 
containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:07:14 crc kubenswrapper[4763]: E1201 10:07:14.995153 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:07:27 crc kubenswrapper[4763]: I1201 10:07:27.994824 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:07:27 crc kubenswrapper[4763]: E1201 10:07:27.996628 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:07:41 crc kubenswrapper[4763]: I1201 10:07:41.994388 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:07:41 crc kubenswrapper[4763]: E1201 10:07:41.995111 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:07:53 crc kubenswrapper[4763]: I1201 10:07:53.001601 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:07:53 crc kubenswrapper[4763]: E1201 10:07:53.002440 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:08:04 crc kubenswrapper[4763]: I1201 10:08:04.994511 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:08:04 crc kubenswrapper[4763]: E1201 10:08:04.996670 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:08:18 crc kubenswrapper[4763]: I1201 10:08:18.994095 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:08:18 crc kubenswrapper[4763]: E1201 10:08:18.995085 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:08:29 crc kubenswrapper[4763]: I1201 10:08:29.994957 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:08:29 crc kubenswrapper[4763]: E1201 10:08:29.995773 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:08:43 crc kubenswrapper[4763]: I1201 10:08:43.000097 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:08:43 crc kubenswrapper[4763]: E1201 10:08:43.000730 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:08:53 crc kubenswrapper[4763]: I1201 10:08:53.994805 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:08:53 crc kubenswrapper[4763]: E1201 10:08:53.996015 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:09:06 crc kubenswrapper[4763]: I1201 10:09:06.994869 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:09:06 crc kubenswrapper[4763]: E1201 10:09:06.997084 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:09:17 crc kubenswrapper[4763]: I1201 10:09:17.994368 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:09:17 crc kubenswrapper[4763]: E1201 10:09:17.999227 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:09:31 crc kubenswrapper[4763]: I1201 10:09:31.994006 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:09:31 crc kubenswrapper[4763]: E1201 10:09:31.994922 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:09:43 crc kubenswrapper[4763]: I1201 10:09:43.006498 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:09:43 crc kubenswrapper[4763]: E1201 10:09:43.010077 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:09:45 crc kubenswrapper[4763]: I1201 10:09:45.974001 4763 generic.go:334] "Generic (PLEG): container finished" podID="43b000e9-21e9-47f9-8bc7-a93a8747159e" containerID="86c8429555e3e115e85e600b04c140931daa08ce8765195c0aa6fe7a47c70f3b" exitCode=0 Dec 01 10:09:45 crc kubenswrapper[4763]: I1201 10:09:45.974112 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" event={"ID":"43b000e9-21e9-47f9-8bc7-a93a8747159e","Type":"ContainerDied","Data":"86c8429555e3e115e85e600b04c140931daa08ce8765195c0aa6fe7a47c70f3b"} Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.460945 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.624690 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-1\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.624732 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nnbp\" (UniqueName: \"kubernetes.io/projected/43b000e9-21e9-47f9-8bc7-a93a8747159e-kube-api-access-6nnbp\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.624763 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-inventory\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.624834 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-extra-config-0\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.624869 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.624987 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-custom-ceph-combined-ca-bundle\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.625016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-0\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.625069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ssh-key\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.625290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-0\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.625345 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.625519 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph-nova-0\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.633912 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b000e9-21e9-47f9-8bc7-a93a8747159e-kube-api-access-6nnbp" (OuterVolumeSpecName: "kube-api-access-6nnbp") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "kube-api-access-6nnbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.635712 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.645055 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph" (OuterVolumeSpecName: "ceph") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.663541 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.678124 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.689009 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.704208 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.732981 4763 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.733024 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.733036 4763 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.733047 4763 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.733055 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nnbp\" (UniqueName: \"kubernetes.io/projected/43b000e9-21e9-47f9-8bc7-a93a8747159e-kube-api-access-6nnbp\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.733063 4763 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.733072 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.762644 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: E1201 10:09:47.764775 4763 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1 podName:43b000e9-21e9-47f9-8bc7-a93a8747159e nodeName:}" failed. No retries permitted until 2025-12-01 10:09:48.264751358 +0000 UTC m=+3305.533400126 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "nova-cell1-compute-config-1" (UniqueName: "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e") : error deleting /var/lib/kubelet/pods/43b000e9-21e9-47f9-8bc7-a93a8747159e/volume-subpaths: remove /var/lib/kubelet/pods/43b000e9-21e9-47f9-8bc7-a93a8747159e/volume-subpaths: no such file or directory Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.767144 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-inventory" (OuterVolumeSpecName: "inventory") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.770583 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.834899 4763 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.835171 4763 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-inventory\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.835252 4763 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.994546 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" event={"ID":"43b000e9-21e9-47f9-8bc7-a93a8747159e","Type":"ContainerDied","Data":"78646096cc6d6530c1664abf28ec0c4a413e4c53fae535c3b503cc59b3a1e9d5"} Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.994595 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78646096cc6d6530c1664abf28ec0c4a413e4c53fae535c3b503cc59b3a1e9d5" Dec 01 10:09:47 crc kubenswrapper[4763]: I1201 10:09:47.994626 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk" Dec 01 10:09:48 crc kubenswrapper[4763]: I1201 10:09:48.343364 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1\") pod \"43b000e9-21e9-47f9-8bc7-a93a8747159e\" (UID: \"43b000e9-21e9-47f9-8bc7-a93a8747159e\") " Dec 01 10:09:48 crc kubenswrapper[4763]: I1201 10:09:48.346369 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "43b000e9-21e9-47f9-8bc7-a93a8747159e" (UID: "43b000e9-21e9-47f9-8bc7-a93a8747159e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:09:48 crc kubenswrapper[4763]: I1201 10:09:48.447601 4763 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/43b000e9-21e9-47f9-8bc7-a93a8747159e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 01 10:09:56 crc kubenswrapper[4763]: I1201 10:09:56.994438 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:09:56 crc kubenswrapper[4763]: E1201 10:09:56.995239 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.865548 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 10:10:03 crc kubenswrapper[4763]: E1201 10:10:03.867164 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b000e9-21e9-47f9-8bc7-a93a8747159e" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.867337 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b000e9-21e9-47f9-8bc7-a93a8747159e" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.867613 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b000e9-21e9-47f9-8bc7-a93a8747159e" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.868580 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.875422 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.875474 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.885054 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.948540 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.953255 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.956176 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 01 10:10:03 crc kubenswrapper[4763]: I1201 10:10:03.961007 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.007525 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.008983 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-sys\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.009136 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.009242 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-run\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.009330 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.009422 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 
10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.009737 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1a72cdbd-892b-459d-86e2-0dde31be5e39-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.009825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.009895 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.010015 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x4pz\" (UniqueName: \"kubernetes.io/projected/1a72cdbd-892b-459d-86e2-0dde31be5e39-kube-api-access-2x4pz\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.010080 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.010127 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.010149 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.010209 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.010232 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-dev\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 
10:10:04.010294 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.111863 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.111930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.111957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112005 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112026 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-dev\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112032 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112068 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112083 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112164 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/60bdb953-6735-4247-8287-16dbf4187c03-ceph\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112187 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112212 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112254 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112299 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112360 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-sys\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112435 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-sys\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112485 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-scripts\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112508 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-run\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: 
I1201 10:10:04.112590 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112610 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112643 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-lib-modules\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112666 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjgqx\" (UniqueName: \"kubernetes.io/projected/60bdb953-6735-4247-8287-16dbf4187c03-kube-api-access-mjgqx\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112735 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-config-data-custom\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112788 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1a72cdbd-892b-459d-86e2-0dde31be5e39-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112795 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-dev\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112815 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-dev\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112843 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112903 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-run\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112921 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112946 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112970 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-config-data\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.112990 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.113016 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.113056 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.113087 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-nvme\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.113110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x4pz\" (UniqueName: \"kubernetes.io/projected/1a72cdbd-892b-459d-86e2-0dde31be5e39-kube-api-access-2x4pz\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.113129 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " 
pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.113674 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-sys\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.113755 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-run\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.113899 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.114000 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.114409 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/1a72cdbd-892b-459d-86e2-0dde31be5e39-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.124120 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.126196 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.127479 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.146068 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1a72cdbd-892b-459d-86e2-0dde31be5e39-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.159292 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x4pz\" (UniqueName: \"kubernetes.io/projected/1a72cdbd-892b-459d-86e2-0dde31be5e39-kube-api-access-2x4pz\") pod 
\"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.160994 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1a72cdbd-892b-459d-86e2-0dde31be5e39-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"1a72cdbd-892b-459d-86e2-0dde31be5e39\") " pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.190362 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.215777 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60bdb953-6735-4247-8287-16dbf4187c03-ceph\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216063 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216195 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216322 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-sys\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216560 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-scripts\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216725 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-lib-modules\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216885 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjgqx\" (UniqueName: \"kubernetes.io/projected/60bdb953-6735-4247-8287-16dbf4187c03-kube-api-access-mjgqx\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.217013 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-config-data-custom\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 
crc kubenswrapper[4763]: I1201 10:10:04.217112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-lib-modules\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216578 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216644 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-sys\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.217302 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-dev\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.217415 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-run\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.217551 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.217661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-config-data\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.217766 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.217881 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.217987 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 
10:10:04.218097 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-nvme\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.218285 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-nvme\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.218726 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.218873 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.218985 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-run\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.219091 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-dev\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.216619 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.219218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60bdb953-6735-4247-8287-16dbf4187c03-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.223799 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-scripts\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.225276 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/60bdb953-6735-4247-8287-16dbf4187c03-ceph\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.225694 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-config-data-custom\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.227606 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-config-data\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.246123 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60bdb953-6735-4247-8287-16dbf4187c03-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.265934 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjgqx\" (UniqueName: \"kubernetes.io/projected/60bdb953-6735-4247-8287-16dbf4187c03-kube-api-access-mjgqx\") pod \"cinder-backup-0\" (UID: \"60bdb953-6735-4247-8287-16dbf4187c03\") " pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.273264 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.650250 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-ldpps"] Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.651998 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-ldpps" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.667554 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-ldpps"] Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.781184 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.797846 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.808961 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.809219 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.809359 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xlzd8" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.811914 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.838221 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7rp4\" (UniqueName: \"kubernetes.io/projected/017f4f6a-809c-4019-9645-0e24cb5f2827-kube-api-access-w7rp4\") pod \"manila-db-create-ldpps\" (UID: \"017f4f6a-809c-4019-9645-0e24cb5f2827\") " pod="openstack/manila-db-create-ldpps" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.838294 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/017f4f6a-809c-4019-9645-0e24cb5f2827-operator-scripts\") pod \"manila-db-create-ldpps\" (UID: \"017f4f6a-809c-4019-9645-0e24cb5f2827\") " pod="openstack/manila-db-create-ldpps" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.843470 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.924198 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.943488 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.961387 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962397 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7rp4\" (UniqueName: \"kubernetes.io/projected/017f4f6a-809c-4019-9645-0e24cb5f2827-kube-api-access-w7rp4\") pod \"manila-db-create-ldpps\" (UID: \"017f4f6a-809c-4019-9645-0e24cb5f2827\") " pod="openstack/manila-db-create-ldpps" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962671 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962725 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962777 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5swx\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-kube-api-access-l5swx\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962836 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962878 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-logs\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962910 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962947 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/017f4f6a-809c-4019-9645-0e24cb5f2827-operator-scripts\") pod 
\"manila-db-create-ldpps\" (UID: \"017f4f6a-809c-4019-9645-0e24cb5f2827\") " pod="openstack/manila-db-create-ldpps" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.962978 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.963126 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-ceph\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.964428 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/017f4f6a-809c-4019-9645-0e24cb5f2827-operator-scripts\") pod \"manila-db-create-ldpps\" (UID: \"017f4f6a-809c-4019-9645-0e24cb5f2827\") " pod="openstack/manila-db-create-ldpps" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.969830 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 10:10:04 crc kubenswrapper[4763]: I1201 10:10:04.971288 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.084503 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7rp4\" (UniqueName: \"kubernetes.io/projected/017f4f6a-809c-4019-9645-0e24cb5f2827-kube-api-access-w7rp4\") pod \"manila-db-create-ldpps\" (UID: \"017f4f6a-809c-4019-9645-0e24cb5f2827\") " pod="openstack/manila-db-create-ldpps" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.191381 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.204202 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.205939 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206022 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206055 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7dj\" 
(UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-kube-api-access-qx7dj\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206119 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206184 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206248 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206280 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206325 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206378 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206408 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206434 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5swx\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-kube-api-access-l5swx\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.206444 4763 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.211621 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.211742 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-logs\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.211778 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.211803 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.211872 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.211950 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-ceph\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.212707 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-logs\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.227990 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cc5dcd8bf-2zv5x"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.232924 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.249213 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.249385 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.249564 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-qpf87" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.250147 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.254091 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.275740 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:05 crc kubenswrapper[4763]: E1201 10:10:05.276583 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-qx7dj logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="5e9ed243-5568-4410-800a-7a5f17873353" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.283483 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cc5dcd8bf-2zv5x"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.283937 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-ldpps" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.289719 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5swx\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-kube-api-access-l5swx\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.304930 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-3e78-account-create-update-fdhhc"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.312840 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-logs\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.317247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-scripts\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.317356 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.317477 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-config-data\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.317632 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpph5\" (UniqueName: \"kubernetes.io/projected/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-kube-api-access-zpph5\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.317708 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.317866 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx7dj\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-kube-api-access-qx7dj\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.317957 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.318039 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.318110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.318195 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.318275 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.318417 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-horizon-secret-key\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.318518 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.318705 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.319159 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-logs\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.319869 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.322381 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.322540 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3e78-account-create-update-fdhhc"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.340312 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.346307 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.346960 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.355695 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.373203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.373646 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-ceph\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.374502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.386188 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-scripts\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.386612 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-config-data\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.387582 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.393759 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.400865 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.435469 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-logs\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.435561 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-scripts\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.435653 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-config-data\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.435734 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19909ecc-64c8-4808-9447-c0ade391a43b-operator-scripts\") pod \"manila-3e78-account-create-update-fdhhc\" (UID: \"19909ecc-64c8-4808-9447-c0ade391a43b\") " pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.435808 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpph5\" (UniqueName: \"kubernetes.io/projected/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-kube-api-access-zpph5\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.435853 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2b7z\" (UniqueName: 
\"kubernetes.io/projected/19909ecc-64c8-4808-9447-c0ade391a43b-kube-api-access-j2b7z\") pod \"manila-3e78-account-create-update-fdhhc\" (UID: \"19909ecc-64c8-4808-9447-c0ade391a43b\") " pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.436034 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-horizon-secret-key\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.437274 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-logs\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.450632 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx7dj\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-kube-api-access-qx7dj\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.452627 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.456500 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-scripts\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.458296 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-config-data\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.464522 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-horizon-secret-key\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.479700 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpph5\" (UniqueName: \"kubernetes.io/projected/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-kube-api-access-zpph5\") pod \"horizon-5cc5dcd8bf-2zv5x\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.488699 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.537833 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19909ecc-64c8-4808-9447-c0ade391a43b-operator-scripts\") pod \"manila-3e78-account-create-update-fdhhc\" (UID: 
\"19909ecc-64c8-4808-9447-c0ade391a43b\") " pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.537930 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/19909ecc-64c8-4808-9447-c0ade391a43b-kube-api-access-j2b7z\") pod \"manila-3e78-account-create-update-fdhhc\" (UID: \"19909ecc-64c8-4808-9447-c0ade391a43b\") " pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.539587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19909ecc-64c8-4808-9447-c0ade391a43b-operator-scripts\") pod \"manila-3e78-account-create-update-fdhhc\" (UID: \"19909ecc-64c8-4808-9447-c0ade391a43b\") " pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.545375 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.565818 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77cc696d55-zpts8"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.567381 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.576835 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/19909ecc-64c8-4808-9447-c0ade391a43b-kube-api-access-j2b7z\") pod \"manila-3e78-account-create-update-fdhhc\" (UID: \"19909ecc-64c8-4808-9447-c0ade391a43b\") " pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.608998 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77cc696d55-zpts8"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.635588 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.639652 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3d03700b-22aa-4d49-a92e-c5ca2baf9354-horizon-secret-key\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.639801 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-config-data\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.639850 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmr6j\" (UniqueName: \"kubernetes.io/projected/3d03700b-22aa-4d49-a92e-c5ca2baf9354-kube-api-access-vmr6j\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " 
pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.639889 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-scripts\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.639921 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d03700b-22aa-4d49-a92e-c5ca2baf9354-logs\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.744037 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3d03700b-22aa-4d49-a92e-c5ca2baf9354-horizon-secret-key\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.744143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-config-data\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.744177 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmr6j\" (UniqueName: \"kubernetes.io/projected/3d03700b-22aa-4d49-a92e-c5ca2baf9354-kube-api-access-vmr6j\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.744206 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-scripts\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.744233 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d03700b-22aa-4d49-a92e-c5ca2baf9354-logs\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.744699 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d03700b-22aa-4d49-a92e-c5ca2baf9354-logs\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.746843 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-config-data\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.747191 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-scripts\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.752112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3d03700b-22aa-4d49-a92e-c5ca2baf9354-horizon-secret-key\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.780493 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.789012 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmr6j\" (UniqueName: \"kubernetes.io/projected/3d03700b-22aa-4d49-a92e-c5ca2baf9354-kube-api-access-vmr6j\") pod \"horizon-77cc696d55-zpts8\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.790274 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:05 crc kubenswrapper[4763]: I1201 10:10:05.904445 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.251315 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.251565 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"1a72cdbd-892b-459d-86e2-0dde31be5e39","Type":"ContainerStarted","Data":"475c33b130745dcf2b958435972c990331344319022e80de70a841819323348a"} Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.302791 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.344983 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-ldpps"] Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.373290 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx7dj\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-kube-api-access-qx7dj\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.373720 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-logs\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.373996 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-combined-ca-bundle\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.374096 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.374281 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-scripts\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.374618 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-internal-tls-certs\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.374835 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-httpd-run\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.374933 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-ceph\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.375149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-config-data\") pod \"5e9ed243-5568-4410-800a-7a5f17873353\" (UID: \"5e9ed243-5568-4410-800a-7a5f17873353\") " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.374695 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-logs" 
(OuterVolumeSpecName: "logs") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.380612 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.382023 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.382463 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-kube-api-access-qx7dj" (OuterVolumeSpecName: "kube-api-access-qx7dj") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "kube-api-access-qx7dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.385634 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-scripts" (OuterVolumeSpecName: "scripts") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.389399 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.395094 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-config-data" (OuterVolumeSpecName: "config-data") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.395618 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-ceph" (OuterVolumeSpecName: "ceph") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.405529 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.409842 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e9ed243-5568-4410-800a-7a5f17873353" (UID: "5e9ed243-5568-4410-800a-7a5f17873353"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.480673 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.483938 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.484072 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx7dj\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-kube-api-access-qx7dj\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.484162 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.484285 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.484377 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.484595 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e9ed243-5568-4410-800a-7a5f17873353-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.485336 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5e9ed243-5568-4410-800a-7a5f17873353-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.485369 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e9ed243-5568-4410-800a-7a5f17873353-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.521523 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.575385 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cc5dcd8bf-2zv5x"] Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.589413 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:06 crc kubenswrapper[4763]: W1201 10:10:06.612326 4763 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod799573bb_9ed0_43fc_9fc8_c5c5c2634e71.slice/crio-fa49d1c837fdac5d4494c03fe2aaae02f0d901f7715b2a098c63b0836d590eed WatchSource:0}: Error finding container fa49d1c837fdac5d4494c03fe2aaae02f0d901f7715b2a098c63b0836d590eed: Status 404 returned error can't find the container with id fa49d1c837fdac5d4494c03fe2aaae02f0d901f7715b2a098c63b0836d590eed Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.685783 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:06 crc kubenswrapper[4763]: W1201 10:10:06.924493 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19909ecc_64c8_4808_9447_c0ade391a43b.slice/crio-a96b397ca15ec412376672412b5e9888ee6cbca5a7421a58b0be9dba2f779a0f WatchSource:0}: Error finding container a96b397ca15ec412376672412b5e9888ee6cbca5a7421a58b0be9dba2f779a0f: Status 404 returned error can't find the container with id a96b397ca15ec412376672412b5e9888ee6cbca5a7421a58b0be9dba2f779a0f Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.936135 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77cc696d55-zpts8"] Dec 01 10:10:06 crc kubenswrapper[4763]: I1201 10:10:06.967933 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3e78-account-create-update-fdhhc"] Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.267803 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"60bdb953-6735-4247-8287-16dbf4187c03","Type":"ContainerStarted","Data":"789beba166c9d8ad3778256f32d13859a3c46f60b11fa1fd3207ddc74ebfbf71"} Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.270324 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cc5dcd8bf-2zv5x" event={"ID":"799573bb-9ed0-43fc-9fc8-c5c5c2634e71","Type":"ContainerStarted","Data":"fa49d1c837fdac5d4494c03fe2aaae02f0d901f7715b2a098c63b0836d590eed"} Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.273505 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-ldpps" event={"ID":"017f4f6a-809c-4019-9645-0e24cb5f2827","Type":"ContainerStarted","Data":"aab2da22e8c0200e46dc23f6d0d1138c5a10cbed00d157d231a6c74d6f067e18"} Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.273554 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-ldpps" event={"ID":"017f4f6a-809c-4019-9645-0e24cb5f2827","Type":"ContainerStarted","Data":"c85f1c7ecb691113552d764deae873088295f30bdb59a80cd25acfc79cf006dc"} Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.297845 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3e78-account-create-update-fdhhc" event={"ID":"19909ecc-64c8-4808-9447-c0ade391a43b","Type":"ContainerStarted","Data":"a96b397ca15ec412376672412b5e9888ee6cbca5a7421a58b0be9dba2f779a0f"} Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.305397 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-ldpps" podStartSLOduration=3.305376647 podStartE2EDuration="3.305376647s" podCreationTimestamp="2025-12-01 10:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:10:07.293139581 +0000 UTC m=+3324.561788349" watchObservedRunningTime="2025-12-01 10:10:07.305376647 +0000 
UTC m=+3324.574025415" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.307681 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c926de0-d99a-400f-8217-776bea2ca166","Type":"ContainerStarted","Data":"5b10f9bacd0a2005d4f2cae156a8ff0c0a32f73146640b8fbba57fbf4b1eec41"} Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.321551 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.322735 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cc696d55-zpts8" event={"ID":"3d03700b-22aa-4d49-a92e-c5ca2baf9354","Type":"ContainerStarted","Data":"c78c39099d0557d06e89f7dd29b561800769589bcf03a6cf8fa185ba050a23ce"} Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.410892 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.441915 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.468891 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.470723 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.474663 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.478323 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.499242 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532143 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532198 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532267 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532307 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-584bt\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-kube-api-access-584bt\") 
pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532383 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532415 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532726 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532854 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-logs\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.532890 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.634544 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.634912 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.635022 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-logs\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.635056 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " 
pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.635118 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.635142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.635261 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.635317 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-584bt\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-kube-api-access-584bt\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.635378 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.635900 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.638745 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-logs\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.639008 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.654791 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc 
kubenswrapper[4763]: I1201 10:10:07.655145 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.655194 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.674502 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.676771 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-584bt\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-kube-api-access-584bt\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.677859 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.746372 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:07 crc kubenswrapper[4763]: I1201 10:10:07.940533 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:08 crc kubenswrapper[4763]: I1201 10:10:08.359147 4763 generic.go:334] "Generic (PLEG): container finished" podID="017f4f6a-809c-4019-9645-0e24cb5f2827" containerID="aab2da22e8c0200e46dc23f6d0d1138c5a10cbed00d157d231a6c74d6f067e18" exitCode=0 Dec 01 10:10:08 crc kubenswrapper[4763]: I1201 10:10:08.359963 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-ldpps" event={"ID":"017f4f6a-809c-4019-9645-0e24cb5f2827","Type":"ContainerDied","Data":"aab2da22e8c0200e46dc23f6d0d1138c5a10cbed00d157d231a6c74d6f067e18"} Dec 01 10:10:08 crc kubenswrapper[4763]: I1201 10:10:08.362368 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3e78-account-create-update-fdhhc" event={"ID":"19909ecc-64c8-4808-9447-c0ade391a43b","Type":"ContainerStarted","Data":"e517b181aa3ffdcd0ef26bd6858d4db9e4b09a586699b4d2f7537ea5e49ee60e"} Dec 01 10:10:08 crc kubenswrapper[4763]: I1201 10:10:08.364753 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"1a72cdbd-892b-459d-86e2-0dde31be5e39","Type":"ContainerStarted","Data":"c4d8aa664ae1600b59b4de7d28d38348c5cba03f1a73b88c0ee945eb448eae53"} Dec 01 10:10:08 crc kubenswrapper[4763]: I1201 10:10:08.366677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c926de0-d99a-400f-8217-776bea2ca166","Type":"ContainerStarted","Data":"1f21c0373285229e889c09878bc0460ce75704b7811ef595e98d68180a2e858b"} Dec 01 10:10:08 crc kubenswrapper[4763]: I1201 10:10:08.406780 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-3e78-account-create-update-fdhhc" podStartSLOduration=3.406759182 podStartE2EDuration="3.406759182s" podCreationTimestamp="2025-12-01 10:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:10:08.401388554 +0000 UTC m=+3325.670037322" watchObservedRunningTime="2025-12-01 10:10:08.406759182 +0000 UTC m=+3325.675407950" Dec 01 10:10:08 crc kubenswrapper[4763]: I1201 10:10:08.961214 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cc5dcd8bf-2zv5x"] Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.048052 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9ed243-5568-4410-800a-7a5f17873353" path="/var/lib/kubelet/pods/5e9ed243-5568-4410-800a-7a5f17873353/volumes" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.048609 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79cd4dd7b6-cwmzr"] Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.067964 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.068131 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.083541 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79cd4dd7b6-cwmzr"] Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.095745 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-combined-ca-bundle\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.095834 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-secret-key\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.095871 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.095919 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-scripts\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.095953 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-config-data\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.095977 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0343e2-73aa-4c17-8f7c-9835bdba9977-logs\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.096015 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcjb8\" (UniqueName: \"kubernetes.io/projected/aa0343e2-73aa-4c17-8f7c-9835bdba9977-kube-api-access-fcjb8\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.096126 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-tls-certs\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.248548 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-scripts\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 
10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.248887 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-config-data\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.248920 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0343e2-73aa-4c17-8f7c-9835bdba9977-logs\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.248960 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcjb8\" (UniqueName: \"kubernetes.io/projected/aa0343e2-73aa-4c17-8f7c-9835bdba9977-kube-api-access-fcjb8\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.249055 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-tls-certs\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.249133 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-combined-ca-bundle\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.249181 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-secret-key\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.258946 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0343e2-73aa-4c17-8f7c-9835bdba9977-logs\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.259259 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-scripts\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.259961 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-config-data\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.260006 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:09 crc 
kubenswrapper[4763]: I1201 10:10:09.272015 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-tls-certs\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.272028 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-secret-key\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.293969 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-combined-ca-bundle\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.325534 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcjb8\" (UniqueName: \"kubernetes.io/projected/aa0343e2-73aa-4c17-8f7c-9835bdba9977-kube-api-access-fcjb8\") pod \"horizon-79cd4dd7b6-cwmzr\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.345356 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77cc696d55-zpts8"] Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.369340 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c998bbd96-rw26q"] Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.382567 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.403910 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.414525 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c998bbd96-rw26q"] Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.425584 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"60bdb953-6735-4247-8287-16dbf4187c03","Type":"ContainerStarted","Data":"73f69fc948c092571d09249c9f44575e6480edb6e800d6d0f795fd3a77681bae"} Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.431631 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d06d9db6-d18e-42ca-a920-73579820a47f","Type":"ContainerStarted","Data":"755293e642718216a3b1ae12623dc1f9f276e756dd2c4b8b30168e0db4563959"} Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.444329 4763 generic.go:334] "Generic (PLEG): container finished" podID="19909ecc-64c8-4808-9447-c0ade391a43b" containerID="e517b181aa3ffdcd0ef26bd6858d4db9e4b09a586699b4d2f7537ea5e49ee60e" exitCode=0 Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.448581 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3e78-account-create-update-fdhhc" event={"ID":"19909ecc-64c8-4808-9447-c0ade391a43b","Type":"ContainerDied","Data":"e517b181aa3ffdcd0ef26bd6858d4db9e4b09a586699b4d2f7537ea5e49ee60e"} Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.453862 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"1a72cdbd-892b-459d-86e2-0dde31be5e39","Type":"ContainerStarted","Data":"c45b77f3d2bb42f55004bb3764c713527eed6f184b0936d3b7c461262808fa05"} Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.454679 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-horizon-secret-key\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.454723 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2qjd\" (UniqueName: \"kubernetes.io/projected/28f61e56-22f7-45cc-ba59-9624e668a73d-kube-api-access-t2qjd\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.454838 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28f61e56-22f7-45cc-ba59-9624e668a73d-config-data\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.454874 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28f61e56-22f7-45cc-ba59-9624e668a73d-scripts\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.454894 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/28f61e56-22f7-45cc-ba59-9624e668a73d-logs\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.454927 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-horizon-tls-certs\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.454962 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-combined-ca-bundle\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.557348 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-horizon-secret-key\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.557927 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2qjd\" (UniqueName: \"kubernetes.io/projected/28f61e56-22f7-45cc-ba59-9624e668a73d-kube-api-access-t2qjd\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.558108 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28f61e56-22f7-45cc-ba59-9624e668a73d-config-data\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.558186 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28f61e56-22f7-45cc-ba59-9624e668a73d-scripts\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.558289 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f61e56-22f7-45cc-ba59-9624e668a73d-logs\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.558355 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-horizon-tls-certs\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.558500 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-combined-ca-bundle\") pod 
\"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.565824 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28f61e56-22f7-45cc-ba59-9624e668a73d-scripts\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.566844 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28f61e56-22f7-45cc-ba59-9624e668a73d-logs\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.571007 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/28f61e56-22f7-45cc-ba59-9624e668a73d-config-data\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.575228 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-horizon-tls-certs\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.575872 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-horizon-secret-key\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.576438 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f61e56-22f7-45cc-ba59-9624e668a73d-combined-ca-bundle\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.593945 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2qjd\" (UniqueName: \"kubernetes.io/projected/28f61e56-22f7-45cc-ba59-9624e668a73d-kube-api-access-t2qjd\") pod \"horizon-5c998bbd96-rw26q\" (UID: \"28f61e56-22f7-45cc-ba59-9624e668a73d\") " pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:09 crc kubenswrapper[4763]: I1201 10:10:09.839603 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.118217 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-ldpps" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.169890 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=5.523857611 podStartE2EDuration="7.169497296s" podCreationTimestamp="2025-12-01 10:10:03 +0000 UTC" firstStartedPulling="2025-12-01 10:10:05.674976829 +0000 UTC m=+3322.943625597" lastFinishedPulling="2025-12-01 10:10:07.320616514 +0000 UTC m=+3324.589265282" observedRunningTime="2025-12-01 10:10:09.535166157 +0000 UTC m=+3326.803814925" watchObservedRunningTime="2025-12-01 10:10:10.169497296 +0000 UTC m=+3327.438146064" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.197058 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/017f4f6a-809c-4019-9645-0e24cb5f2827-operator-scripts\") pod \"017f4f6a-809c-4019-9645-0e24cb5f2827\" (UID: \"017f4f6a-809c-4019-9645-0e24cb5f2827\") " Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.197486 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7rp4\" (UniqueName: \"kubernetes.io/projected/017f4f6a-809c-4019-9645-0e24cb5f2827-kube-api-access-w7rp4\") pod \"017f4f6a-809c-4019-9645-0e24cb5f2827\" (UID: \"017f4f6a-809c-4019-9645-0e24cb5f2827\") " Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.198770 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/017f4f6a-809c-4019-9645-0e24cb5f2827-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "017f4f6a-809c-4019-9645-0e24cb5f2827" (UID: "017f4f6a-809c-4019-9645-0e24cb5f2827"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.207581 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017f4f6a-809c-4019-9645-0e24cb5f2827-kube-api-access-w7rp4" (OuterVolumeSpecName: "kube-api-access-w7rp4") pod "017f4f6a-809c-4019-9645-0e24cb5f2827" (UID: "017f4f6a-809c-4019-9645-0e24cb5f2827"). InnerVolumeSpecName "kube-api-access-w7rp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.299741 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/017f4f6a-809c-4019-9645-0e24cb5f2827-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.299772 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7rp4\" (UniqueName: \"kubernetes.io/projected/017f4f6a-809c-4019-9645-0e24cb5f2827-kube-api-access-w7rp4\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.382033 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79cd4dd7b6-cwmzr"] Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.477806 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c926de0-d99a-400f-8217-776bea2ca166","Type":"ContainerStarted","Data":"3102082a9bd1f3c1dac0164bd4c0ae395d155b8ab0eb3d3037ce3f10b6824178"} Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.478178 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4c926de0-d99a-400f-8217-776bea2ca166" containerName="glance-log" containerID="cri-o://1f21c0373285229e889c09878bc0460ce75704b7811ef595e98d68180a2e858b" gracePeriod=30 Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.478575 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4c926de0-d99a-400f-8217-776bea2ca166" containerName="glance-httpd" containerID="cri-o://3102082a9bd1f3c1dac0164bd4c0ae395d155b8ab0eb3d3037ce3f10b6824178" gracePeriod=30 Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.501492 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"60bdb953-6735-4247-8287-16dbf4187c03","Type":"ContainerStarted","Data":"b12734361480e1a7d464bcdd581f4fb15f56653705d71454563ccf91fa2c928c"} Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.515989 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.515969778 podStartE2EDuration="7.515969778s" podCreationTimestamp="2025-12-01 10:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:10:10.501171783 +0000 UTC m=+3327.769820541" watchObservedRunningTime="2025-12-01 10:10:10.515969778 +0000 UTC m=+3327.784618546" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.528443 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79cd4dd7b6-cwmzr" event={"ID":"aa0343e2-73aa-4c17-8f7c-9835bdba9977","Type":"ContainerStarted","Data":"6e3c423f62123e022affe69df503408f26c279989857772a1498204850e84433"} Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.544981 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=6.200123487 podStartE2EDuration="7.544963962s" podCreationTimestamp="2025-12-01 10:10:03 +0000 UTC" firstStartedPulling="2025-12-01 10:10:06.487699685 +0000 UTC m=+3323.756348453" lastFinishedPulling="2025-12-01 10:10:07.83254016 +0000 UTC m=+3325.101188928" observedRunningTime="2025-12-01 10:10:10.544221492 +0000 UTC m=+3327.812870260" watchObservedRunningTime="2025-12-01 
10:10:10.544963962 +0000 UTC m=+3327.813612730" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.550074 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-ldpps" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.550950 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-ldpps" event={"ID":"017f4f6a-809c-4019-9645-0e24cb5f2827","Type":"ContainerDied","Data":"c85f1c7ecb691113552d764deae873088295f30bdb59a80cd25acfc79cf006dc"} Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.551001 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c85f1c7ecb691113552d764deae873088295f30bdb59a80cd25acfc79cf006dc" Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.627204 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c998bbd96-rw26q"] Dec 01 10:10:10 crc kubenswrapper[4763]: I1201 10:10:10.994849 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:10:10 crc kubenswrapper[4763]: E1201 10:10:10.995783 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.579762 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.642822 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d06d9db6-d18e-42ca-a920-73579820a47f","Type":"ContainerStarted","Data":"60c8d7393e534a5a84adf1336601d5f31eaca7a5b5dd5df471103fd36c8b5a62"} Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.645779 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3e78-account-create-update-fdhhc" event={"ID":"19909ecc-64c8-4808-9447-c0ade391a43b","Type":"ContainerDied","Data":"a96b397ca15ec412376672412b5e9888ee6cbca5a7421a58b0be9dba2f779a0f"} Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.646518 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a96b397ca15ec412376672412b5e9888ee6cbca5a7421a58b0be9dba2f779a0f" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.646640 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3e78-account-create-update-fdhhc" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.669023 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/19909ecc-64c8-4808-9447-c0ade391a43b-kube-api-access-j2b7z\") pod \"19909ecc-64c8-4808-9447-c0ade391a43b\" (UID: \"19909ecc-64c8-4808-9447-c0ade391a43b\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.669070 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19909ecc-64c8-4808-9447-c0ade391a43b-operator-scripts\") pod \"19909ecc-64c8-4808-9447-c0ade391a43b\" (UID: \"19909ecc-64c8-4808-9447-c0ade391a43b\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.670017 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19909ecc-64c8-4808-9447-c0ade391a43b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19909ecc-64c8-4808-9447-c0ade391a43b" (UID: "19909ecc-64c8-4808-9447-c0ade391a43b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.674361 4763 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19909ecc-64c8-4808-9447-c0ade391a43b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.677817 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c926de0-d99a-400f-8217-776bea2ca166" containerID="3102082a9bd1f3c1dac0164bd4c0ae395d155b8ab0eb3d3037ce3f10b6824178" exitCode=143 Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.677856 4763 generic.go:334] "Generic (PLEG): container finished" podID="4c926de0-d99a-400f-8217-776bea2ca166" containerID="1f21c0373285229e889c09878bc0460ce75704b7811ef595e98d68180a2e858b" exitCode=143 Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.677938 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c926de0-d99a-400f-8217-776bea2ca166","Type":"ContainerDied","Data":"3102082a9bd1f3c1dac0164bd4c0ae395d155b8ab0eb3d3037ce3f10b6824178"} Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.677975 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c926de0-d99a-400f-8217-776bea2ca166","Type":"ContainerDied","Data":"1f21c0373285229e889c09878bc0460ce75704b7811ef595e98d68180a2e858b"} Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.700155 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19909ecc-64c8-4808-9447-c0ade391a43b-kube-api-access-j2b7z" (OuterVolumeSpecName: "kube-api-access-j2b7z") pod "19909ecc-64c8-4808-9447-c0ade391a43b" (UID: "19909ecc-64c8-4808-9447-c0ade391a43b"). InnerVolumeSpecName "kube-api-access-j2b7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.716515 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c998bbd96-rw26q" event={"ID":"28f61e56-22f7-45cc-ba59-9624e668a73d","Type":"ContainerStarted","Data":"05afda844f945c9a31bba1a51b05ae66d6443d15cf951681541ce1ad01850381"} Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.770665 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.776022 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2b7z\" (UniqueName: \"kubernetes.io/projected/19909ecc-64c8-4808-9447-c0ade391a43b-kube-api-access-j2b7z\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.877933 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.878000 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-public-tls-certs\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.878030 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-config-data\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.878086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-ceph\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.878166 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-logs\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.878188 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-httpd-run\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.878222 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-combined-ca-bundle\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.878298 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-scripts\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: 
\"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.878331 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5swx\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-kube-api-access-l5swx\") pod \"4c926de0-d99a-400f-8217-776bea2ca166\" (UID: \"4c926de0-d99a-400f-8217-776bea2ca166\") " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.879192 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-logs" (OuterVolumeSpecName: "logs") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.879889 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.888720 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-kube-api-access-l5swx" (OuterVolumeSpecName: "kube-api-access-l5swx") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "kube-api-access-l5swx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.889799 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.891962 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-ceph" (OuterVolumeSpecName: "ceph") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.892923 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-scripts" (OuterVolumeSpecName: "scripts") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.957795 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.981840 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.981940 4763 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.981960 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.981972 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.981985 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c926de0-d99a-400f-8217-776bea2ca166-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.982009 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:11 crc kubenswrapper[4763]: I1201 10:10:11.982022 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5swx\" (UniqueName: \"kubernetes.io/projected/4c926de0-d99a-400f-8217-776bea2ca166-kube-api-access-l5swx\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.024562 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.048245 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-config-data" (OuterVolumeSpecName: "config-data") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.050710 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c926de0-d99a-400f-8217-776bea2ca166" (UID: "4c926de0-d99a-400f-8217-776bea2ca166"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.087315 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.088223 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.088240 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c926de0-d99a-400f-8217-776bea2ca166-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.741093 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4c926de0-d99a-400f-8217-776bea2ca166","Type":"ContainerDied","Data":"5b10f9bacd0a2005d4f2cae156a8ff0c0a32f73146640b8fbba57fbf4b1eec41"} Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.741140 4763 scope.go:117] "RemoveContainer" containerID="3102082a9bd1f3c1dac0164bd4c0ae395d155b8ab0eb3d3037ce3f10b6824178" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.741250 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.764700 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d06d9db6-d18e-42ca-a920-73579820a47f","Type":"ContainerStarted","Data":"73fdf1096e74fc248ba765ff0f0d79e839c63416dffb66221352ffbdb41ceefc"} Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.764868 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" containerName="glance-log" containerID="cri-o://60c8d7393e534a5a84adf1336601d5f31eaca7a5b5dd5df471103fd36c8b5a62" gracePeriod=30 Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.764993 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" containerName="glance-httpd" containerID="cri-o://73fdf1096e74fc248ba765ff0f0d79e839c63416dffb66221352ffbdb41ceefc" gracePeriod=30 Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.816563 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.816520917 podStartE2EDuration="5.816520917s" podCreationTimestamp="2025-12-01 10:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:10:12.807531121 +0000 UTC m=+3330.076179889" watchObservedRunningTime="2025-12-01 10:10:12.816520917 +0000 UTC m=+3330.085169685" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.852574 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.859933 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.868730 4763 scope.go:117] "RemoveContainer" 
containerID="1f21c0373285229e889c09878bc0460ce75704b7811ef595e98d68180a2e858b" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.899512 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:12 crc kubenswrapper[4763]: E1201 10:10:12.899910 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c926de0-d99a-400f-8217-776bea2ca166" containerName="glance-httpd" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.899925 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c926de0-d99a-400f-8217-776bea2ca166" containerName="glance-httpd" Dec 01 10:10:12 crc kubenswrapper[4763]: E1201 10:10:12.899948 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="017f4f6a-809c-4019-9645-0e24cb5f2827" containerName="mariadb-database-create" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.899955 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="017f4f6a-809c-4019-9645-0e24cb5f2827" containerName="mariadb-database-create" Dec 01 10:10:12 crc kubenswrapper[4763]: E1201 10:10:12.899971 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c926de0-d99a-400f-8217-776bea2ca166" containerName="glance-log" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.899977 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c926de0-d99a-400f-8217-776bea2ca166" containerName="glance-log" Dec 01 10:10:12 crc kubenswrapper[4763]: E1201 10:10:12.899991 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19909ecc-64c8-4808-9447-c0ade391a43b" containerName="mariadb-account-create-update" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.899997 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="19909ecc-64c8-4808-9447-c0ade391a43b" containerName="mariadb-account-create-update" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.900178 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c926de0-d99a-400f-8217-776bea2ca166" containerName="glance-httpd" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.900190 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="19909ecc-64c8-4808-9447-c0ade391a43b" containerName="mariadb-account-create-update" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.900200 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="017f4f6a-809c-4019-9645-0e24cb5f2827" containerName="mariadb-database-create" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.900212 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c926de0-d99a-400f-8217-776bea2ca166" containerName="glance-log" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.901138 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.906701 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.906884 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 01 10:10:12 crc kubenswrapper[4763]: I1201 10:10:12.921444 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.008309 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.008365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.008407 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-scripts\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.008433 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.024155 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-ceph\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.024321 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-config-data\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.024523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjdd8\" (UniqueName: \"kubernetes.io/projected/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-kube-api-access-jjdd8\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.024659 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.024738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-logs\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.051488 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c926de0-d99a-400f-8217-776bea2ca166" path="/var/lib/kubelet/pods/4c926de0-d99a-400f-8217-776bea2ca166/volumes" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.126648 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-config-data\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.126931 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjdd8\" (UniqueName: \"kubernetes.io/projected/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-kube-api-access-jjdd8\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.126991 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.127034 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-logs\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.127097 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.127118 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.127146 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-scripts\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " 
pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.127163 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.127183 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-ceph\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.128227 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-logs\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.128263 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.161370 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-ceph\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.166380 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.170132 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjdd8\" (UniqueName: \"kubernetes.io/projected/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-kube-api-access-jjdd8\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.170779 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.186185 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.204173 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-config-data\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.205243 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f34e35d3-3a43-49ee-bee1-9ccc51135eb7-scripts\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.332827 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"f34e35d3-3a43-49ee-bee1-9ccc51135eb7\") " pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.373147 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.794170 4763 generic.go:334] "Generic (PLEG): container finished" podID="d06d9db6-d18e-42ca-a920-73579820a47f" containerID="73fdf1096e74fc248ba765ff0f0d79e839c63416dffb66221352ffbdb41ceefc" exitCode=143 Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.794546 4763 generic.go:334] "Generic (PLEG): container finished" podID="d06d9db6-d18e-42ca-a920-73579820a47f" containerID="60c8d7393e534a5a84adf1336601d5f31eaca7a5b5dd5df471103fd36c8b5a62" exitCode=143 Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.794256 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d06d9db6-d18e-42ca-a920-73579820a47f","Type":"ContainerDied","Data":"73fdf1096e74fc248ba765ff0f0d79e839c63416dffb66221352ffbdb41ceefc"} Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.794689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d06d9db6-d18e-42ca-a920-73579820a47f","Type":"ContainerDied","Data":"60c8d7393e534a5a84adf1336601d5f31eaca7a5b5dd5df471103fd36c8b5a62"} Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.870956 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.988783 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-httpd-run\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.989117 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-internal-tls-certs\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.989174 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-584bt\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-kube-api-access-584bt\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.989207 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-logs\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.989266 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-ceph\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.989285 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.989309 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-scripts\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.989372 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-combined-ca-bundle\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.990217 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-config-data\") pod \"d06d9db6-d18e-42ca-a920-73579820a47f\" (UID: \"d06d9db6-d18e-42ca-a920-73579820a47f\") " Dec 01 10:10:13 crc kubenswrapper[4763]: I1201 10:10:13.998155 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.000750 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.006514 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-logs" (OuterVolumeSpecName: "logs") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.034201 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-scripts" (OuterVolumeSpecName: "scripts") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.035202 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-ceph" (OuterVolumeSpecName: "ceph") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.036265 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-kube-api-access-584bt" (OuterVolumeSpecName: "kube-api-access-584bt") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "kube-api-access-584bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.091281 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.095930 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.097306 4763 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.097329 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-584bt\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-kube-api-access-584bt\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.097341 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d06d9db6-d18e-42ca-a920-73579820a47f-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.097352 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d06d9db6-d18e-42ca-a920-73579820a47f-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.097380 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.097389 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.115000 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-config-data" (OuterVolumeSpecName: "config-data") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.168421 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.191908 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.202895 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.202928 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.226766 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d06d9db6-d18e-42ca-a920-73579820a47f" (UID: "d06d9db6-d18e-42ca-a920-73579820a47f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.274185 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.304521 4763 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d06d9db6-d18e-42ca-a920-73579820a47f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.487996 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.567990 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-volume-volume1-0" podUID="1a72cdbd-892b-459d-86e2-0dde31be5e39" containerName="cinder-volume" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.730704 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-backup-0" podUID="60bdb953-6735-4247-8287-16dbf4187c03" containerName="cinder-backup" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.825717 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f34e35d3-3a43-49ee-bee1-9ccc51135eb7","Type":"ContainerStarted","Data":"56ea14970a2cdf02d586cd36562bc43aad16598af808e9819bbd6b07e67a814d"} Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.831884 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d06d9db6-d18e-42ca-a920-73579820a47f","Type":"ContainerDied","Data":"755293e642718216a3b1ae12623dc1f9f276e756dd2c4b8b30168e0db4563959"} Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.831935 4763 scope.go:117] "RemoveContainer" containerID="73fdf1096e74fc248ba765ff0f0d79e839c63416dffb66221352ffbdb41ceefc" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.832002 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.889595 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.907433 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.922966 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:14 crc kubenswrapper[4763]: E1201 10:10:14.923345 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" containerName="glance-log" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.923356 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" containerName="glance-log" Dec 01 10:10:14 crc kubenswrapper[4763]: E1201 10:10:14.923370 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" containerName="glance-httpd" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.923376 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" containerName="glance-httpd" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.923567 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" containerName="glance-httpd" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.923583 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" containerName="glance-log" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.924493 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.927806 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.929368 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.929673 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 01 10:10:14 crc kubenswrapper[4763]: I1201 10:10:14.973629 4763 scope.go:117] "RemoveContainer" containerID="60c8d7393e534a5a84adf1336601d5f31eaca7a5b5dd5df471103fd36c8b5a62" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.026816 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d06d9db6-d18e-42ca-a920-73579820a47f" path="/var/lib/kubelet/pods/d06d9db6-d18e-42ca-a920-73579820a47f/volumes" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028085 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028135 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028205 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-logs\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028223 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ktg\" (UniqueName: \"kubernetes.io/projected/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-kube-api-access-24ktg\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028267 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028307 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028324 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028341 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.028364 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131212 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131282 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131371 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-logs\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131393 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ktg\" (UniqueName: \"kubernetes.io/projected/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-kube-api-access-24ktg\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131533 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131581 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131630 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131649 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.131692 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.140119 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-logs\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.140835 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.142575 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.143640 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.144686 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.147362 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.163831 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.173757 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.180970 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ktg\" (UniqueName: \"kubernetes.io/projected/c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78-kube-api-access-24ktg\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.237203 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78\") " pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.301496 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.479768 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-qrvr5"] Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.481340 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.489585 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.501632 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-rslxr" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.525284 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-qrvr5"] Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.572567 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-config-data\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.572627 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmmhh\" (UniqueName: \"kubernetes.io/projected/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-kube-api-access-dmmhh\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.572842 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-job-config-data\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.572896 4763 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-combined-ca-bundle\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.676834 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-job-config-data\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.676883 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-combined-ca-bundle\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.677219 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-config-data\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.677284 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmmhh\" (UniqueName: \"kubernetes.io/projected/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-kube-api-access-dmmhh\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.691771 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-config-data\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.702988 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-combined-ca-bundle\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.717191 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-job-config-data\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.723322 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmmhh\" (UniqueName: \"kubernetes.io/projected/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-kube-api-access-dmmhh\") pod \"manila-db-sync-qrvr5\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:15 crc kubenswrapper[4763]: I1201 10:10:15.899360 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-qrvr5" Dec 01 10:10:16 crc kubenswrapper[4763]: I1201 10:10:16.179577 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 01 10:10:16 crc kubenswrapper[4763]: I1201 10:10:16.576065 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-qrvr5"] Dec 01 10:10:16 crc kubenswrapper[4763]: W1201 10:10:16.603238 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b9fee43_2772_4498_8cd2_dc2ae1d8b9da.slice/crio-0f00e4bb660c482ccfd828f247f99d87d3489c0d5984adbe9441250299deb1e9 WatchSource:0}: Error finding container 0f00e4bb660c482ccfd828f247f99d87d3489c0d5984adbe9441250299deb1e9: Status 404 returned error can't find the container with id 0f00e4bb660c482ccfd828f247f99d87d3489c0d5984adbe9441250299deb1e9 Dec 01 10:10:16 crc kubenswrapper[4763]: I1201 10:10:16.869284 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-qrvr5" event={"ID":"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da","Type":"ContainerStarted","Data":"0f00e4bb660c482ccfd828f247f99d87d3489c0d5984adbe9441250299deb1e9"} Dec 01 10:10:16 crc kubenswrapper[4763]: I1201 10:10:16.874248 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78","Type":"ContainerStarted","Data":"bd0b76dbb7f7091bb8990375830bba62e34593c8faff29b415fe48680a49d153"} Dec 01 10:10:16 crc kubenswrapper[4763]: I1201 10:10:16.879468 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f34e35d3-3a43-49ee-bee1-9ccc51135eb7","Type":"ContainerStarted","Data":"38146252780cde48c4143b985a4dee6cc86d5ba2156df846d719755598c20bb4"} Dec 01 10:10:17 crc kubenswrapper[4763]: I1201 10:10:17.910893 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f34e35d3-3a43-49ee-bee1-9ccc51135eb7","Type":"ContainerStarted","Data":"73b092bf352cf9e698863eceb102a8100ae2ce3a734dea73dbd4f3ae0370a7c4"} Dec 01 10:10:17 crc kubenswrapper[4763]: I1201 10:10:17.915626 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78","Type":"ContainerStarted","Data":"88b259ef4214e4a31732d651e51ac02c722a53e43bb82c6e880c7428cc3f86a5"} Dec 01 10:10:17 crc kubenswrapper[4763]: I1201 10:10:17.943738 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.943718447 podStartE2EDuration="5.943718447s" podCreationTimestamp="2025-12-01 10:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:10:17.935930195 +0000 UTC m=+3335.204578963" watchObservedRunningTime="2025-12-01 10:10:17.943718447 +0000 UTC m=+3335.212367215" Dec 01 10:10:18 crc kubenswrapper[4763]: I1201 10:10:18.928856 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78","Type":"ContainerStarted","Data":"31daaa194b7b9955dfeee3396f6b4d0e75b86bbfe0746fdd08bcc6e9632402b9"} Dec 01 10:10:19 crc kubenswrapper[4763]: I1201 10:10:19.195797 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-volume-volume1-0" Dec 01 10:10:19 crc kubenswrapper[4763]: I1201 10:10:19.228297 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.22825726 podStartE2EDuration="5.22825726s" podCreationTimestamp="2025-12-01 10:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:10:18.952112204 +0000 UTC m=+3336.220760972" watchObservedRunningTime="2025-12-01 10:10:19.22825726 +0000 UTC m=+3336.496906028" Dec 01 10:10:19 crc kubenswrapper[4763]: I1201 10:10:19.284077 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 01 10:10:23 crc kubenswrapper[4763]: I1201 10:10:23.003315 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:10:23 crc kubenswrapper[4763]: E1201 10:10:23.006415 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:10:23 crc kubenswrapper[4763]: I1201 10:10:23.374617 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 10:10:23 crc kubenswrapper[4763]: I1201 10:10:23.374953 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 01 10:10:23 crc kubenswrapper[4763]: I1201 10:10:23.410479 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 10:10:23 crc kubenswrapper[4763]: I1201 10:10:23.428393 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 01 10:10:23 crc kubenswrapper[4763]: I1201 10:10:23.981842 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 10:10:23 crc kubenswrapper[4763]: I1201 10:10:23.981921 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 01 10:10:25 crc kubenswrapper[4763]: I1201 10:10:25.302611 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:25 crc kubenswrapper[4763]: I1201 10:10:25.302679 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:25 crc kubenswrapper[4763]: I1201 10:10:25.346942 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:25 crc kubenswrapper[4763]: I1201 10:10:25.359068 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:26 crc kubenswrapper[4763]: I1201 10:10:26.005779 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:26 crc kubenswrapper[4763]: I1201 10:10:26.005817 4763 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:27 crc kubenswrapper[4763]: I1201 10:10:27.637063 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 10:10:27 crc kubenswrapper[4763]: I1201 10:10:27.637208 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:10:27 crc kubenswrapper[4763]: I1201 10:10:27.641223 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 01 10:10:29 crc kubenswrapper[4763]: I1201 10:10:29.042507 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:29 crc kubenswrapper[4763]: I1201 10:10:29.042869 4763 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 01 10:10:29 crc kubenswrapper[4763]: I1201 10:10:29.076812 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 01 10:10:31 crc kubenswrapper[4763]: E1201 10:10:31.456884 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-manila-api:current-podified" Dec 01 10:10:31 crc kubenswrapper[4763]: E1201 10:10:31.457289 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manila-db-sync,Image:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,Command:[/bin/bash],Args:[-c sleep 0 && /usr/bin/manila-manage --config-dir /etc/manila/manila.conf.d db sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:job-config-data,ReadOnly:true,MountPath:/etc/manila/manila.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmmhh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42429,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42429,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-db-sync-qrvr5_openstack(9b9fee43-2772-4498-8cd2-dc2ae1d8b9da): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Dec 01 10:10:31 crc kubenswrapper[4763]: E1201 10:10:31.458560 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/manila-db-sync-qrvr5" podUID="9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" Dec 01 10:10:32 crc kubenswrapper[4763]: I1201 10:10:32.066227 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cc5dcd8bf-2zv5x" event={"ID":"799573bb-9ed0-43fc-9fc8-c5c5c2634e71","Type":"ContainerStarted","Data":"6f42b635e28ba2d75e7e5874b74e4c0ce1ff5e0942a216ec96291745c0dc6a37"} Dec 01 10:10:32 crc kubenswrapper[4763]: I1201 10:10:32.068121 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79cd4dd7b6-cwmzr" event={"ID":"aa0343e2-73aa-4c17-8f7c-9835bdba9977","Type":"ContainerStarted","Data":"23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a"} Dec 01 10:10:32 crc kubenswrapper[4763]: I1201 10:10:32.070580 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cc696d55-zpts8" event={"ID":"3d03700b-22aa-4d49-a92e-c5ca2baf9354","Type":"ContainerStarted","Data":"a3df93ac0d4a5496bb2eb091ea4ddb1a8d053041ea710ec774cb3e3d089fe0dc"} Dec 01 10:10:32 crc kubenswrapper[4763]: I1201 10:10:32.072602 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c998bbd96-rw26q" event={"ID":"28f61e56-22f7-45cc-ba59-9624e668a73d","Type":"ContainerStarted","Data":"7c2d2bf63c427440d20d5cab4bf4c76ad07ad9b2eb525fefe93a9d57f42339c9"} Dec 01 10:10:32 crc kubenswrapper[4763]: E1201 10:10:32.074354 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-manila-api:current-podified\\\"\"" pod="openstack/manila-db-sync-qrvr5" podUID="9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.083170 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cc696d55-zpts8" event={"ID":"3d03700b-22aa-4d49-a92e-c5ca2baf9354","Type":"ContainerStarted","Data":"abd8fcb13f2f213442cd8fb728d6fc3a1459be3147e6bd459485e85714e68ef3"} Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.083395 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77cc696d55-zpts8" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerName="horizon-log" containerID="cri-o://a3df93ac0d4a5496bb2eb091ea4ddb1a8d053041ea710ec774cb3e3d089fe0dc" gracePeriod=30 Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.085517 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77cc696d55-zpts8" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerName="horizon" containerID="cri-o://abd8fcb13f2f213442cd8fb728d6fc3a1459be3147e6bd459485e85714e68ef3" gracePeriod=30 Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.088167 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c998bbd96-rw26q" event={"ID":"28f61e56-22f7-45cc-ba59-9624e668a73d","Type":"ContainerStarted","Data":"b694ce560afa01f85cec40155f5d188d00476a44a9c3ef61d8464d493b783aa4"} Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.090520 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cc5dcd8bf-2zv5x" 
event={"ID":"799573bb-9ed0-43fc-9fc8-c5c5c2634e71","Type":"ContainerStarted","Data":"6f1452ec9e8516501382001156d3dc4be1cba78db4e79fbc5ff07e402d4f3563"} Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.090646 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cc5dcd8bf-2zv5x" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerName="horizon-log" containerID="cri-o://6f42b635e28ba2d75e7e5874b74e4c0ce1ff5e0942a216ec96291745c0dc6a37" gracePeriod=30 Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.090736 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cc5dcd8bf-2zv5x" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerName="horizon" containerID="cri-o://6f1452ec9e8516501382001156d3dc4be1cba78db4e79fbc5ff07e402d4f3563" gracePeriod=30 Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.097555 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79cd4dd7b6-cwmzr" event={"ID":"aa0343e2-73aa-4c17-8f7c-9835bdba9977","Type":"ContainerStarted","Data":"133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35"} Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.115012 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77cc696d55-zpts8" podStartSLOduration=3.532069044 podStartE2EDuration="28.114988956s" podCreationTimestamp="2025-12-01 10:10:05 +0000 UTC" firstStartedPulling="2025-12-01 10:10:06.914997061 +0000 UTC m=+3324.183645819" lastFinishedPulling="2025-12-01 10:10:31.497916963 +0000 UTC m=+3348.766565731" observedRunningTime="2025-12-01 10:10:33.109441114 +0000 UTC m=+3350.378089882" watchObservedRunningTime="2025-12-01 10:10:33.114988956 +0000 UTC m=+3350.383637724" Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.135783 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cc5dcd8bf-2zv5x" podStartSLOduration=4.218273668 podStartE2EDuration="29.135761525s" podCreationTimestamp="2025-12-01 10:10:04 +0000 UTC" firstStartedPulling="2025-12-01 10:10:06.615742033 +0000 UTC m=+3323.884390801" lastFinishedPulling="2025-12-01 10:10:31.53322989 +0000 UTC m=+3348.801878658" observedRunningTime="2025-12-01 10:10:33.127153649 +0000 UTC m=+3350.395802417" watchObservedRunningTime="2025-12-01 10:10:33.135761525 +0000 UTC m=+3350.404410293" Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.171993 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79cd4dd7b6-cwmzr" podStartSLOduration=4.125457464 podStartE2EDuration="25.171957427s" podCreationTimestamp="2025-12-01 10:10:08 +0000 UTC" firstStartedPulling="2025-12-01 10:10:10.45873737 +0000 UTC m=+3327.727386138" lastFinishedPulling="2025-12-01 10:10:31.505237333 +0000 UTC m=+3348.773886101" observedRunningTime="2025-12-01 10:10:33.155618289 +0000 UTC m=+3350.424267067" watchObservedRunningTime="2025-12-01 10:10:33.171957427 +0000 UTC m=+3350.440606195" Dec 01 10:10:33 crc kubenswrapper[4763]: I1201 10:10:33.190516 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c998bbd96-rw26q" podStartSLOduration=3.484995604 podStartE2EDuration="24.190501755s" podCreationTimestamp="2025-12-01 10:10:09 +0000 UTC" firstStartedPulling="2025-12-01 10:10:10.791031854 +0000 UTC m=+3328.059680612" lastFinishedPulling="2025-12-01 10:10:31.496538005 +0000 UTC m=+3348.765186763" observedRunningTime="2025-12-01 10:10:33.187933385 +0000 UTC 
m=+3350.456582153" watchObservedRunningTime="2025-12-01 10:10:33.190501755 +0000 UTC m=+3350.459150523" Dec 01 10:10:34 crc kubenswrapper[4763]: I1201 10:10:34.994486 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:10:34 crc kubenswrapper[4763]: E1201 10:10:34.995725 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:10:35 crc kubenswrapper[4763]: I1201 10:10:35.781284 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:10:35 crc kubenswrapper[4763]: I1201 10:10:35.905423 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:10:39 crc kubenswrapper[4763]: I1201 10:10:39.406393 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:39 crc kubenswrapper[4763]: I1201 10:10:39.406748 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:10:39 crc kubenswrapper[4763]: I1201 10:10:39.840786 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:39 crc kubenswrapper[4763]: I1201 10:10:39.841772 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:10:47 crc kubenswrapper[4763]: I1201 10:10:46.999406 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:10:47 crc kubenswrapper[4763]: E1201 10:10:47.000861 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:10:48 crc kubenswrapper[4763]: I1201 10:10:48.330004 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-qrvr5" event={"ID":"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da","Type":"ContainerStarted","Data":"e9c475c7206c829f3cfb2adc355bd659bb71faa2720e12a3ee7a7df69d3a377d"} Dec 01 10:10:48 crc kubenswrapper[4763]: I1201 10:10:48.354401 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-qrvr5" podStartSLOduration=3.293797649 podStartE2EDuration="33.354380923s" podCreationTimestamp="2025-12-01 10:10:15 +0000 UTC" firstStartedPulling="2025-12-01 10:10:16.607826398 +0000 UTC m=+3333.876475166" lastFinishedPulling="2025-12-01 10:10:46.668409672 +0000 UTC m=+3363.937058440" observedRunningTime="2025-12-01 10:10:48.348894012 +0000 UTC m=+3365.617542780" watchObservedRunningTime="2025-12-01 10:10:48.354380923 +0000 UTC m=+3365.623029691" Dec 01 10:10:49 crc kubenswrapper[4763]: I1201 10:10:49.409338 4763 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-79cd4dd7b6-cwmzr" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Dec 01 10:10:49 crc kubenswrapper[4763]: I1201 10:10:49.841482 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5c998bbd96-rw26q" podUID="28f61e56-22f7-45cc-ba59-9624e668a73d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Dec 01 10:11:01 crc kubenswrapper[4763]: I1201 10:11:01.457507 4763 generic.go:334] "Generic (PLEG): container finished" podID="9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" containerID="e9c475c7206c829f3cfb2adc355bd659bb71faa2720e12a3ee7a7df69d3a377d" exitCode=0 Dec 01 10:11:01 crc kubenswrapper[4763]: I1201 10:11:01.457668 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-qrvr5" event={"ID":"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da","Type":"ContainerDied","Data":"e9c475c7206c829f3cfb2adc355bd659bb71faa2720e12a3ee7a7df69d3a377d"} Dec 01 10:11:01 crc kubenswrapper[4763]: I1201 10:11:01.491349 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:11:01 crc kubenswrapper[4763]: I1201 10:11:01.657931 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:11:01 crc kubenswrapper[4763]: I1201 10:11:01.995092 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:11:01 crc kubenswrapper[4763]: E1201 10:11:01.995377 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.429591 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.493036 4763 generic.go:334] "Generic (PLEG): container finished" podID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerID="abd8fcb13f2f213442cd8fb728d6fc3a1459be3147e6bd459485e85714e68ef3" exitCode=137 Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.493064 4763 generic.go:334] "Generic (PLEG): container finished" podID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerID="a3df93ac0d4a5496bb2eb091ea4ddb1a8d053041ea710ec774cb3e3d089fe0dc" exitCode=137 Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.493103 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cc696d55-zpts8" event={"ID":"3d03700b-22aa-4d49-a92e-c5ca2baf9354","Type":"ContainerDied","Data":"abd8fcb13f2f213442cd8fb728d6fc3a1459be3147e6bd459485e85714e68ef3"} Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.493127 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cc696d55-zpts8" 
event={"ID":"3d03700b-22aa-4d49-a92e-c5ca2baf9354","Type":"ContainerDied","Data":"a3df93ac0d4a5496bb2eb091ea4ddb1a8d053041ea710ec774cb3e3d089fe0dc"} Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.500219 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-qrvr5" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.510368 4763 generic.go:334] "Generic (PLEG): container finished" podID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerID="6f1452ec9e8516501382001156d3dc4be1cba78db4e79fbc5ff07e402d4f3563" exitCode=137 Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.510397 4763 generic.go:334] "Generic (PLEG): container finished" podID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerID="6f42b635e28ba2d75e7e5874b74e4c0ce1ff5e0942a216ec96291745c0dc6a37" exitCode=137 Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.510437 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cc5dcd8bf-2zv5x" event={"ID":"799573bb-9ed0-43fc-9fc8-c5c5c2634e71","Type":"ContainerDied","Data":"6f1452ec9e8516501382001156d3dc4be1cba78db4e79fbc5ff07e402d4f3563"} Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.510478 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cc5dcd8bf-2zv5x" event={"ID":"799573bb-9ed0-43fc-9fc8-c5c5c2634e71","Type":"ContainerDied","Data":"6f42b635e28ba2d75e7e5874b74e4c0ce1ff5e0942a216ec96291745c0dc6a37"} Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.523968 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-qrvr5" event={"ID":"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da","Type":"ContainerDied","Data":"0f00e4bb660c482ccfd828f247f99d87d3489c0d5984adbe9441250299deb1e9"} Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.524211 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f00e4bb660c482ccfd828f247f99d87d3489c0d5984adbe9441250299deb1e9" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.524052 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-qrvr5" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.547431 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5c998bbd96-rw26q" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.622438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-job-config-data\") pod \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.622507 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmmhh\" (UniqueName: \"kubernetes.io/projected/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-kube-api-access-dmmhh\") pod \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.622534 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-combined-ca-bundle\") pod \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.622689 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-config-data\") pod \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\" (UID: \"9b9fee43-2772-4498-8cd2-dc2ae1d8b9da\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.635746 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-kube-api-access-dmmhh" (OuterVolumeSpecName: "kube-api-access-dmmhh") pod "9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" (UID: "9b9fee43-2772-4498-8cd2-dc2ae1d8b9da"). InnerVolumeSpecName "kube-api-access-dmmhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.646730 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-config-data" (OuterVolumeSpecName: "config-data") pod "9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" (UID: "9b9fee43-2772-4498-8cd2-dc2ae1d8b9da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.673160 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" (UID: "9b9fee43-2772-4498-8cd2-dc2ae1d8b9da"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.691801 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79cd4dd7b6-cwmzr"] Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.692012 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79cd4dd7b6-cwmzr" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon-log" containerID="cri-o://23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a" gracePeriod=30 Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.692139 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79cd4dd7b6-cwmzr" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon" containerID="cri-o://133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35" gracePeriod=30 Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.695747 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" (UID: "9b9fee43-2772-4498-8cd2-dc2ae1d8b9da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.709553 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.723779 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d03700b-22aa-4d49-a92e-c5ca2baf9354-logs\") pod \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.723820 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmr6j\" (UniqueName: \"kubernetes.io/projected/3d03700b-22aa-4d49-a92e-c5ca2baf9354-kube-api-access-vmr6j\") pod \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.723887 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-scripts\") pod \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.723923 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-config-data\") pod \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.724005 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3d03700b-22aa-4d49-a92e-c5ca2baf9354-horizon-secret-key\") pod \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\" (UID: \"3d03700b-22aa-4d49-a92e-c5ca2baf9354\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.724264 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-config-data\") on node \"crc\" DevicePath 
\"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.724275 4763 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.724285 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmmhh\" (UniqueName: \"kubernetes.io/projected/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-kube-api-access-dmmhh\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.724295 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.729754 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d03700b-22aa-4d49-a92e-c5ca2baf9354-logs" (OuterVolumeSpecName: "logs") pod "3d03700b-22aa-4d49-a92e-c5ca2baf9354" (UID: "3d03700b-22aa-4d49-a92e-c5ca2baf9354"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.739803 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d03700b-22aa-4d49-a92e-c5ca2baf9354-kube-api-access-vmr6j" (OuterVolumeSpecName: "kube-api-access-vmr6j") pod "3d03700b-22aa-4d49-a92e-c5ca2baf9354" (UID: "3d03700b-22aa-4d49-a92e-c5ca2baf9354"). InnerVolumeSpecName "kube-api-access-vmr6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.745083 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d03700b-22aa-4d49-a92e-c5ca2baf9354-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3d03700b-22aa-4d49-a92e-c5ca2baf9354" (UID: "3d03700b-22aa-4d49-a92e-c5ca2baf9354"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.782779 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-scripts" (OuterVolumeSpecName: "scripts") pod "3d03700b-22aa-4d49-a92e-c5ca2baf9354" (UID: "3d03700b-22aa-4d49-a92e-c5ca2baf9354"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.783564 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-config-data" (OuterVolumeSpecName: "config-data") pod "3d03700b-22aa-4d49-a92e-c5ca2baf9354" (UID: "3d03700b-22aa-4d49-a92e-c5ca2baf9354"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.822034 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.826575 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpph5\" (UniqueName: \"kubernetes.io/projected/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-kube-api-access-zpph5\") pod \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.826639 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-scripts\") pod \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.826751 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-config-data\") pod \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.826850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-horizon-secret-key\") pod \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.826886 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-logs\") pod \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\" (UID: \"799573bb-9ed0-43fc-9fc8-c5c5c2634e71\") " Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.827306 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d03700b-22aa-4d49-a92e-c5ca2baf9354-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.827324 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmr6j\" (UniqueName: \"kubernetes.io/projected/3d03700b-22aa-4d49-a92e-c5ca2baf9354-kube-api-access-vmr6j\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.827334 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.827342 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3d03700b-22aa-4d49-a92e-c5ca2baf9354-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.827352 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3d03700b-22aa-4d49-a92e-c5ca2baf9354-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.828002 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-logs" (OuterVolumeSpecName: "logs") pod "799573bb-9ed0-43fc-9fc8-c5c5c2634e71" (UID: "799573bb-9ed0-43fc-9fc8-c5c5c2634e71"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.830331 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-kube-api-access-zpph5" (OuterVolumeSpecName: "kube-api-access-zpph5") pod "799573bb-9ed0-43fc-9fc8-c5c5c2634e71" (UID: "799573bb-9ed0-43fc-9fc8-c5c5c2634e71"). InnerVolumeSpecName "kube-api-access-zpph5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.839994 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "799573bb-9ed0-43fc-9fc8-c5c5c2634e71" (UID: "799573bb-9ed0-43fc-9fc8-c5c5c2634e71"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.856400 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-config-data" (OuterVolumeSpecName: "config-data") pod "799573bb-9ed0-43fc-9fc8-c5c5c2634e71" (UID: "799573bb-9ed0-43fc-9fc8-c5c5c2634e71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.864448 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-scripts" (OuterVolumeSpecName: "scripts") pod "799573bb-9ed0-43fc-9fc8-c5c5c2634e71" (UID: "799573bb-9ed0-43fc-9fc8-c5c5c2634e71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.928936 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.928963 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.928973 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpph5\" (UniqueName: \"kubernetes.io/projected/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-kube-api-access-zpph5\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.928982 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:03 crc kubenswrapper[4763]: I1201 10:11:03.928993 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/799573bb-9ed0-43fc-9fc8-c5c5c2634e71-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.535904 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77cc696d55-zpts8" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.537488 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77cc696d55-zpts8" event={"ID":"3d03700b-22aa-4d49-a92e-c5ca2baf9354","Type":"ContainerDied","Data":"c78c39099d0557d06e89f7dd29b561800769589bcf03a6cf8fa185ba050a23ce"} Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.537666 4763 scope.go:117] "RemoveContainer" containerID="abd8fcb13f2f213442cd8fb728d6fc3a1459be3147e6bd459485e85714e68ef3" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.538310 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cc5dcd8bf-2zv5x" event={"ID":"799573bb-9ed0-43fc-9fc8-c5c5c2634e71","Type":"ContainerDied","Data":"fa49d1c837fdac5d4494c03fe2aaae02f0d901f7715b2a098c63b0836d590eed"} Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.538494 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cc5dcd8bf-2zv5x" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.582884 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cc5dcd8bf-2zv5x"] Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.596828 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cc5dcd8bf-2zv5x"] Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.609155 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77cc696d55-zpts8"] Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.617840 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77cc696d55-zpts8"] Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.723524 4763 scope.go:117] "RemoveContainer" containerID="a3df93ac0d4a5496bb2eb091ea4ddb1a8d053041ea710ec774cb3e3d089fe0dc" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.753628 4763 scope.go:117] "RemoveContainer" containerID="6f1452ec9e8516501382001156d3dc4be1cba78db4e79fbc5ff07e402d4f3563" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.754435 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 10:11:04 crc kubenswrapper[4763]: E1201 10:11:04.754949 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerName="horizon" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.754975 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerName="horizon" Dec 01 10:11:04 crc kubenswrapper[4763]: E1201 10:11:04.754998 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerName="horizon-log" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755007 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerName="horizon-log" Dec 01 10:11:04 crc kubenswrapper[4763]: E1201 10:11:04.755033 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerName="horizon-log" Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755042 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerName="horizon-log" Dec 01 10:11:04 crc kubenswrapper[4763]: E1201 10:11:04.755055 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerName="horizon" 
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755061 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerName="horizon"
Dec 01 10:11:04 crc kubenswrapper[4763]: E1201 10:11:04.755079 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" containerName="manila-db-sync"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755087 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" containerName="manila-db-sync"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755282 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerName="horizon"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755302 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerName="horizon"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755315 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" containerName="horizon-log"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755337 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" containerName="horizon-log"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.755351 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" containerName="manila-db-sync"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.756714 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.767801 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-rslxr"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.768099 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.768223 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.768935 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.799640 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.897536 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.907751 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.924810 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.925794 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.969871 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.969935 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mmc8\" (UniqueName: \"kubernetes.io/projected/796d5724-e2a3-4fb6-9346-5a93b34e385a-kube-api-access-8mmc8\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.969974 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-scripts\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.970056 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.970074 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:04 crc kubenswrapper[4763]: I1201 10:11:04.970127 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/796d5724-e2a3-4fb6-9346-5a93b34e385a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.029716 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d03700b-22aa-4d49-a92e-c5ca2baf9354" path="/var/lib/kubelet/pods/3d03700b-22aa-4d49-a92e-c5ca2baf9354/volumes"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.030517 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="799573bb-9ed0-43fc-9fc8-c5c5c2634e71" path="/var/lib/kubelet/pods/799573bb-9ed0-43fc-9fc8-c5c5c2634e71/volumes"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.072783 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-scripts\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.072886 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.072935 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-ceph\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.072966 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mmc8\" (UniqueName: \"kubernetes.io/projected/796d5724-e2a3-4fb6-9346-5a93b34e385a-kube-api-access-8mmc8\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073004 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073033 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-scripts\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073063 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt26k\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-kube-api-access-tt26k\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073097 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073137 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073232 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073288 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073337 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/796d5724-e2a3-4fb6-9346-5a93b34e385a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.073404 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.074175 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/796d5724-e2a3-4fb6-9346-5a93b34e385a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.082659 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-scripts\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.089411 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.091059 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.107040 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.118235 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mmc8\" (UniqueName: \"kubernetes.io/projected/796d5724-e2a3-4fb6-9346-5a93b34e385a-kube-api-access-8mmc8\") pod \"manila-scheduler-0\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.151208 4763 scope.go:117] "RemoveContainer" containerID="6f42b635e28ba2d75e7e5874b74e4c0ce1ff5e0942a216ec96291745c0dc6a37"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.176274 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-ceph\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.176351 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.176387 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt26k\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-kube-api-access-tt26k\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.176622 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.176659 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.176724 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.176789 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.176830 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-scripts\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.177475 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c846ff5b9-2xtfs"]
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.179105 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.179591 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.179771 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.180246 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-ceph\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.194202 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-scripts\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.195022 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.202420 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c846ff5b9-2xtfs"]
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.204115 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.205805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.225050 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt26k\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-kube-api-access-tt26k\") pod \"manila-share-share1-0\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.266222 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.273340 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.275040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.276894 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.279255 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-scripts\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.279418 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.279523 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.279628 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-config\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.279728 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-logs\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.279825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data-custom\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.279932 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-ovsdbserver-sb\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.280016 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-ovsdbserver-nb\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.280118 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-dns-svc\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.280203 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75rmd\" (UniqueName: \"kubernetes.io/projected/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-kube-api-access-75rmd\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.280365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-etc-machine-id\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.280444 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9qnf\" (UniqueName: \"kubernetes.io/projected/34405493-5281-4822-b8f1-11e68aa61470-kube-api-access-q9qnf\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.280549 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.302137 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.384946 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-ovsdbserver-nb\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385009 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-dns-svc\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385040 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75rmd\" (UniqueName: \"kubernetes.io/projected/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-kube-api-access-75rmd\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385112 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-etc-machine-id\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385132 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9qnf\" (UniqueName: \"kubernetes.io/projected/34405493-5281-4822-b8f1-11e68aa61470-kube-api-access-q9qnf\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385167 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385204 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-scripts\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385223 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385250 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385286 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-config\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385321 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-logs\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385358 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data-custom\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.385402 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-ovsdbserver-sb\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.388416 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.389004 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-ovsdbserver-nb\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.389529 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-dns-svc\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.389795 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-etc-machine-id\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.393214 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-ovsdbserver-sb\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.393448 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34405493-5281-4822-b8f1-11e68aa61470-config\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.393751 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.394413 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-logs\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.400982 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-scripts\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.401386 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.403799 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.407272 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data-custom\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.412465 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9qnf\" (UniqueName: \"kubernetes.io/projected/34405493-5281-4822-b8f1-11e68aa61470-kube-api-access-q9qnf\") pod \"dnsmasq-dns-5c846ff5b9-2xtfs\" (UID: \"34405493-5281-4822-b8f1-11e68aa61470\") " pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.419123 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75rmd\" (UniqueName: \"kubernetes.io/projected/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-kube-api-access-75rmd\") pod \"manila-api-0\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") " pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.451995 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 01 10:11:05 crc kubenswrapper[4763]: I1201 10:11:05.667133 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:06 crc kubenswrapper[4763]: I1201 10:11:06.094166 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Dec 01 10:11:06 crc kubenswrapper[4763]: I1201 10:11:06.252754 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Dec 01 10:11:06 crc kubenswrapper[4763]: I1201 10:11:06.365555 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c846ff5b9-2xtfs"]
Dec 01 10:11:06 crc kubenswrapper[4763]: W1201 10:11:06.399562 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34405493_5281_4822_b8f1_11e68aa61470.slice/crio-6b2673ce2c9fd841e786f5e068639fd52971f0b55414091b9bd2a56475e024af WatchSource:0}: Error finding container 6b2673ce2c9fd841e786f5e068639fd52971f0b55414091b9bd2a56475e024af: Status 404 returned error can't find the container with id 6b2673ce2c9fd841e786f5e068639fd52971f0b55414091b9bd2a56475e024af
Dec 01 10:11:06 crc kubenswrapper[4763]: W1201 10:11:06.540314 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod877b11fd_fc0a_4bb0_9f2e_7d09d3078d18.slice/crio-75670e6bd44a824ee2bfe3eb552d40b3578cedfeb51a16877d2e9a9e2728801e WatchSource:0}: Error finding container 75670e6bd44a824ee2bfe3eb552d40b3578cedfeb51a16877d2e9a9e2728801e: Status 404 returned error can't find the container with id 75670e6bd44a824ee2bfe3eb552d40b3578cedfeb51a16877d2e9a9e2728801e
Dec 01 10:11:06 crc kubenswrapper[4763]: I1201 10:11:06.540368 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Dec 01 10:11:06 crc kubenswrapper[4763]: I1201 10:11:06.589404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"796d5724-e2a3-4fb6-9346-5a93b34e385a","Type":"ContainerStarted","Data":"498fbdb6d35528ae6b8bc487b39ae21a9bf3ba598369e5fb9ffb25c1834a35a3"}
Dec 01 10:11:06 crc kubenswrapper[4763]: I1201 10:11:06.593076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs" event={"ID":"34405493-5281-4822-b8f1-11e68aa61470","Type":"ContainerStarted","Data":"6b2673ce2c9fd841e786f5e068639fd52971f0b55414091b9bd2a56475e024af"}
Dec 01 10:11:06 crc kubenswrapper[4763]: I1201 10:11:06.599757 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ee391497-0e27-412d-9209-a5c47226a435","Type":"ContainerStarted","Data":"917d60a72c1990814f2d1a0b6bf444c5e2fa5d61c3a7ec2c7794316eec28c367"}
Dec 01 10:11:06 crc kubenswrapper[4763]: I1201 10:11:06.609675 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18","Type":"ContainerStarted","Data":"75670e6bd44a824ee2bfe3eb552d40b3578cedfeb51a16877d2e9a9e2728801e"}
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:07.622641 4763 generic.go:334] "Generic (PLEG): container finished" podID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerID="133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35" exitCode=0
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:07.622682 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79cd4dd7b6-cwmzr" event={"ID":"aa0343e2-73aa-4c17-8f7c-9835bdba9977","Type":"ContainerDied","Data":"133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35"}
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:07.631411 4763 generic.go:334] "Generic (PLEG): container finished" podID="34405493-5281-4822-b8f1-11e68aa61470" containerID="638726f475866e00d60a2b7b3d58716a8a53dc3ab08d2692007c918a25ae578c" exitCode=0
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:07.631797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs" event={"ID":"34405493-5281-4822-b8f1-11e68aa61470","Type":"ContainerDied","Data":"638726f475866e00d60a2b7b3d58716a8a53dc3ab08d2692007c918a25ae578c"}
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:07.647689 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18","Type":"ContainerStarted","Data":"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f"}
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:08.688764 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18","Type":"ContainerStarted","Data":"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657"}
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:08.689103 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:08.710504 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"796d5724-e2a3-4fb6-9346-5a93b34e385a","Type":"ContainerStarted","Data":"eb4643355b67ca85bf7fd1bc1bb6fe692b4f388fbc00b0a65593b3c080768642"}
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:08.711018 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.711008747 podStartE2EDuration="3.711008747s" podCreationTimestamp="2025-12-01 10:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:11:08.706103992 +0000 UTC m=+3385.974752760" watchObservedRunningTime="2025-12-01 10:11:08.711008747 +0000 UTC m=+3385.979657515"
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:08.724322 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs" event={"ID":"34405493-5281-4822-b8f1-11e68aa61470","Type":"ContainerStarted","Data":"1a2cea67a7178a6873fcdc07e4e2c4fd82df7f30c946e5fcc746f9e7bfb54306"}
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:08.724620 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs"
Dec 01 10:11:08 crc kubenswrapper[4763]: I1201 10:11:08.754914 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs" podStartSLOduration=3.754898109 podStartE2EDuration="3.754898109s" podCreationTimestamp="2025-12-01 10:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:11:08.751532997 +0000 UTC m=+3386.020181755" watchObservedRunningTime="2025-12-01 10:11:08.754898109 +0000 UTC m=+3386.023546877"
Dec 01 10:11:09 crc kubenswrapper[4763]: I1201 10:11:09.346515 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Dec 01 10:11:09 crc kubenswrapper[4763]: I1201 10:11:09.406941 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79cd4dd7b6-cwmzr" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused"
Dec 01 10:11:09 crc kubenswrapper[4763]: I1201 10:11:09.739499 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"796d5724-e2a3-4fb6-9346-5a93b34e385a","Type":"ContainerStarted","Data":"a811288e2224e5ba2242375d747591a0903f1c95e51f6cd267a9b8619070e020"}
Dec 01 10:11:09 crc kubenswrapper[4763]: I1201 10:11:09.760501 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.015383335 podStartE2EDuration="5.760482549s" podCreationTimestamp="2025-12-01 10:11:04 +0000 UTC" firstStartedPulling="2025-12-01 10:11:06.130574519 +0000 UTC m=+3383.399223287" lastFinishedPulling="2025-12-01 10:11:06.875673733 +0000 UTC m=+3384.144322501" observedRunningTime="2025-12-01 10:11:09.758416312 +0000 UTC m=+3387.027065090" watchObservedRunningTime="2025-12-01 10:11:09.760482549 +0000 UTC m=+3387.029131317"
Dec 01 10:11:10 crc kubenswrapper[4763]: I1201 10:11:10.745329 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerName="manila-api-log" containerID="cri-o://adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f" gracePeriod=30
Dec 01 10:11:10 crc kubenswrapper[4763]: I1201 10:11:10.745363 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerName="manila-api" containerID="cri-o://b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657" gracePeriod=30
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.606947 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.635930 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data\") pod \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") "
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.636065 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-combined-ca-bundle\") pod \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") "
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.636086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75rmd\" (UniqueName: \"kubernetes.io/projected/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-kube-api-access-75rmd\") pod \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") "
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.636123 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-logs\") pod \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") "
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.636198 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data-custom\") pod \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") "
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.636242 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-scripts\") pod \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") "
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.636334 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-etc-machine-id\") pod \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\" (UID: \"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18\") "
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.636873 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" (UID: "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.637195 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-logs" (OuterVolumeSpecName: "logs") pod "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" (UID: "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.643424 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-kube-api-access-75rmd" (OuterVolumeSpecName: "kube-api-access-75rmd") pod "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" (UID: "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18"). InnerVolumeSpecName "kube-api-access-75rmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.645033 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" (UID: "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.653025 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-scripts" (OuterVolumeSpecName: "scripts") pod "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" (UID: "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.691772 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" (UID: "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.737685 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data" (OuterVolumeSpecName: "config-data") pod "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" (UID: "877b11fd-fc0a-4bb0-9f2e-7d09d3078d18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.742923 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.742947 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data\") on node \"crc\" DevicePath \"\""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.742956 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.742972 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75rmd\" (UniqueName: \"kubernetes.io/projected/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-kube-api-access-75rmd\") on node \"crc\" DevicePath \"\""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.742985 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-logs\") on node \"crc\" DevicePath \"\""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.742994 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.743005 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18-scripts\") on node \"crc\" DevicePath \"\""
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.760514 4763 generic.go:334] "Generic (PLEG): container finished" podID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerID="b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657" exitCode=0
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.760553 4763 generic.go:334] "Generic (PLEG): container finished" podID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerID="adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f" exitCode=143
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.760573 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18","Type":"ContainerDied","Data":"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657"}
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.760598 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18","Type":"ContainerDied","Data":"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f"}
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.760610 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"877b11fd-fc0a-4bb0-9f2e-7d09d3078d18","Type":"ContainerDied","Data":"75670e6bd44a824ee2bfe3eb552d40b3578cedfeb51a16877d2e9a9e2728801e"}
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.760611 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.760632 4763 scope.go:117] "RemoveContainer" containerID="b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.799503 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.804182 4763 scope.go:117] "RemoveContainer" containerID="adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.811407 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"]
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.852378 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Dec 01 10:11:11 crc kubenswrapper[4763]: E1201 10:11:11.852805 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerName="manila-api-log"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.852818 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerName="manila-api-log"
Dec 01 10:11:11 crc kubenswrapper[4763]: E1201 10:11:11.852859 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerName="manila-api"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.852867 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerName="manila-api"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.853033 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerName="manila-api"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.853049 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" containerName="manila-api-log"
Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.856382 4763 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.863406 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.869033 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.871085 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.872659 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.945730 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.945813 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-public-tls-certs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.945844 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-scripts\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.945874 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-config-data\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.945892 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.945915 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg279\" (UniqueName: \"kubernetes.io/projected/6730d033-63cf-46f2-b779-e751663b7735-kube-api-access-wg279\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.945957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-config-data-custom\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.945989 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6730d033-63cf-46f2-b779-e751663b7735-etc-machine-id\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:11 crc kubenswrapper[4763]: I1201 10:11:11.946011 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6730d033-63cf-46f2-b779-e751663b7735-logs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.008083 4763 scope.go:117] "RemoveContainer" containerID="b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657" Dec 01 10:11:12 crc kubenswrapper[4763]: E1201 10:11:12.009044 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657\": container with ID starting with b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657 not found: ID does not exist" containerID="b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.009086 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657"} err="failed to get container status \"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657\": rpc error: code = NotFound desc = could not find container \"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657\": container with ID starting with b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657 not found: ID does not exist" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.009114 4763 scope.go:117] "RemoveContainer" containerID="adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f" Dec 01 10:11:12 crc kubenswrapper[4763]: E1201 10:11:12.009578 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f\": container with ID starting with adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f not found: ID does not exist" containerID="adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.009616 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f"} err="failed to get container status \"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f\": rpc error: code = NotFound desc = could not find container \"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f\": container with ID starting with adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f not found: ID does not exist" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.009638 4763 scope.go:117] "RemoveContainer" containerID="b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.010103 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657"} err="failed to get container status \"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657\": rpc error: code = NotFound 
desc = could not find container \"b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657\": container with ID starting with b3693b5eab112035bccd6c63a0c284b096308948c7a28e2b38f15d371463f657 not found: ID does not exist" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.010124 4763 scope.go:117] "RemoveContainer" containerID="adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.010372 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f"} err="failed to get container status \"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f\": rpc error: code = NotFound desc = could not find container \"adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f\": container with ID starting with adc30cb4da03ee6ce4aba47357f0741e0bc62f88c93dad4b0c2b270edcee631f not found: ID does not exist" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.047788 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.047847 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-public-tls-certs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.047874 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-scripts\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.047906 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-config-data\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.047921 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.047948 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg279\" (UniqueName: \"kubernetes.io/projected/6730d033-63cf-46f2-b779-e751663b7735-kube-api-access-wg279\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.048003 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-config-data-custom\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 
10:11:12.048048 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6730d033-63cf-46f2-b779-e751663b7735-etc-machine-id\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.048082 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6730d033-63cf-46f2-b779-e751663b7735-logs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.048596 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6730d033-63cf-46f2-b779-e751663b7735-logs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.053301 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6730d033-63cf-46f2-b779-e751663b7735-etc-machine-id\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.057952 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-public-tls-certs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.061667 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.062303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.065976 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-config-data\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.067112 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-config-data-custom\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.076836 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6730d033-63cf-46f2-b779-e751663b7735-scripts\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.078541 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wg279\" (UniqueName: \"kubernetes.io/projected/6730d033-63cf-46f2-b779-e751663b7735-kube-api-access-wg279\") pod \"manila-api-0\" (UID: \"6730d033-63cf-46f2-b779-e751663b7735\") " pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.298016 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.918528 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.918825 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="ceilometer-central-agent" containerID="cri-o://1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf" gracePeriod=30 Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.919268 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="proxy-httpd" containerID="cri-o://31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0" gracePeriod=30 Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.919314 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="sg-core" containerID="cri-o://1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e" gracePeriod=30 Dec 01 10:11:12 crc kubenswrapper[4763]: I1201 10:11:12.919351 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="ceilometer-notification-agent" containerID="cri-o://80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2" gracePeriod=30 Dec 01 10:11:13 crc kubenswrapper[4763]: I1201 10:11:13.007644 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877b11fd-fc0a-4bb0-9f2e-7d09d3078d18" path="/var/lib/kubelet/pods/877b11fd-fc0a-4bb0-9f2e-7d09d3078d18/volumes" Dec 01 10:11:13 crc kubenswrapper[4763]: E1201 10:11:13.439009 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75e5bac3_096d_42dd_b1f8_19c03774fb1c.slice/crio-conmon-1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75e5bac3_096d_42dd_b1f8_19c03774fb1c.slice/crio-1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:11:13 crc kubenswrapper[4763]: I1201 10:11:13.793947 4763 generic.go:334] "Generic (PLEG): container finished" podID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerID="31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0" exitCode=0 Dec 01 10:11:13 crc kubenswrapper[4763]: I1201 10:11:13.793975 4763 generic.go:334] "Generic (PLEG): container finished" podID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerID="1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e" exitCode=2 Dec 01 10:11:13 crc kubenswrapper[4763]: I1201 10:11:13.793982 4763 generic.go:334] "Generic (PLEG): container finished" podID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" 
containerID="1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf" exitCode=0 Dec 01 10:11:13 crc kubenswrapper[4763]: I1201 10:11:13.793999 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerDied","Data":"31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0"} Dec 01 10:11:13 crc kubenswrapper[4763]: I1201 10:11:13.794023 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerDied","Data":"1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e"} Dec 01 10:11:13 crc kubenswrapper[4763]: I1201 10:11:13.794034 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerDied","Data":"1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf"} Dec 01 10:11:13 crc kubenswrapper[4763]: I1201 10:11:13.994252 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:11:13 crc kubenswrapper[4763]: E1201 10:11:13.994830 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:11:14 crc kubenswrapper[4763]: I1201 10:11:14.956928 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gmnqb"] Dec 01 10:11:14 crc kubenswrapper[4763]: I1201 10:11:14.959505 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.010162 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-utilities\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.010593 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6598\" (UniqueName: \"kubernetes.io/projected/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-kube-api-access-z6598\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.011181 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-catalog-content\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.029743 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmnqb"] Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.114586 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6598\" (UniqueName: \"kubernetes.io/projected/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-kube-api-access-z6598\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.114769 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-catalog-content\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.114812 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-utilities\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.116197 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-utilities\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.119863 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-catalog-content\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.137408 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z6598\" (UniqueName: \"kubernetes.io/projected/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-kube-api-access-z6598\") pod \"redhat-marketplace-gmnqb\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.280906 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.394749 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.671121 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c846ff5b9-2xtfs" Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.792408 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-nksxv"] Dec 01 10:11:15 crc kubenswrapper[4763]: I1201 10:11:15.792706 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" podUID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" containerName="dnsmasq-dns" containerID="cri-o://d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267" gracePeriod=10 Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.657697 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmnqb"] Dec 01 10:11:16 crc kubenswrapper[4763]: W1201 10:11:16.716331 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d904c0f_8eb4_4cb4_8904_18ceb0ab8e70.slice/crio-f99364a6fdc84dc17efcba3a50336142aaa5ef2b2c033d070bf08dbe7c8b1645 WatchSource:0}: Error finding container f99364a6fdc84dc17efcba3a50336142aaa5ef2b2c033d070bf08dbe7c8b1645: Status 404 returned error can't find the container with id f99364a6fdc84dc17efcba3a50336142aaa5ef2b2c033d070bf08dbe7c8b1645 Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.766383 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.845181 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmnqb" event={"ID":"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70","Type":"ContainerStarted","Data":"f99364a6fdc84dc17efcba3a50336142aaa5ef2b2c033d070bf08dbe7c8b1645"} Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.861623 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-sb\") pod \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.861735 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-openstack-edpm-ipam\") pod \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.862005 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tfbm\" (UniqueName: \"kubernetes.io/projected/a50257f6-6461-4ee3-b40b-4f56fe98dfad-kube-api-access-2tfbm\") pod \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.862044 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-nb\") pod \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.862104 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-dns-svc\") pod \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.862149 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-config\") pod \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\" (UID: \"a50257f6-6461-4ee3-b40b-4f56fe98dfad\") " Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.909797 4763 generic.go:334] "Generic (PLEG): container finished" podID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" containerID="d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267" exitCode=0 Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.909840 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" event={"ID":"a50257f6-6461-4ee3-b40b-4f56fe98dfad","Type":"ContainerDied","Data":"d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267"} Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.909865 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" event={"ID":"a50257f6-6461-4ee3-b40b-4f56fe98dfad","Type":"ContainerDied","Data":"74ea43238c9e8e7b43cb4cfc8129870192e0570c1801205226fa6b6f14f41d28"} Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.909882 4763 scope.go:117] "RemoveContainer" 
containerID="d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267" Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.910034 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667ff9c869-nksxv" Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.912125 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a50257f6-6461-4ee3-b40b-4f56fe98dfad-kube-api-access-2tfbm" (OuterVolumeSpecName: "kube-api-access-2tfbm") pod "a50257f6-6461-4ee3-b40b-4f56fe98dfad" (UID: "a50257f6-6461-4ee3-b40b-4f56fe98dfad"). InnerVolumeSpecName "kube-api-access-2tfbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.921203 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.955236 4763 scope.go:117] "RemoveContainer" containerID="d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad" Dec 01 10:11:16 crc kubenswrapper[4763]: I1201 10:11:16.964016 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tfbm\" (UniqueName: \"kubernetes.io/projected/a50257f6-6461-4ee3-b40b-4f56fe98dfad-kube-api-access-2tfbm\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.029988 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a50257f6-6461-4ee3-b40b-4f56fe98dfad" (UID: "a50257f6-6461-4ee3-b40b-4f56fe98dfad"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.044658 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a50257f6-6461-4ee3-b40b-4f56fe98dfad" (UID: "a50257f6-6461-4ee3-b40b-4f56fe98dfad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.044875 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a50257f6-6461-4ee3-b40b-4f56fe98dfad" (UID: "a50257f6-6461-4ee3-b40b-4f56fe98dfad"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.045372 4763 scope.go:117] "RemoveContainer" containerID="d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267" Dec 01 10:11:17 crc kubenswrapper[4763]: E1201 10:11:17.050182 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267\": container with ID starting with d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267 not found: ID does not exist" containerID="d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.050235 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267"} err="failed to get container status \"d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267\": rpc error: code = NotFound desc = could not find container \"d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267\": container with ID starting with d2009f9121d090c55a38a7e3c3f38bf4dae800a23c516fe8fbace0127abc3267 not found: ID does not exist" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.050257 4763 scope.go:117] "RemoveContainer" containerID="d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad" Dec 01 10:11:17 crc kubenswrapper[4763]: E1201 10:11:17.050714 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad\": container with ID starting with d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad not found: ID does not exist" containerID="d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.050742 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad"} err="failed to get container status \"d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad\": rpc error: code = NotFound desc = could not find container \"d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad\": container with ID starting with d70ea71e957bb1fdea73f39eb9ef3e15788cfdee1e2b467b296db56fce6470ad not found: ID does not exist" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.069580 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.069612 4763 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.069625 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.088333 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-sb" 
(OuterVolumeSpecName: "ovsdbserver-sb") pod "a50257f6-6461-4ee3-b40b-4f56fe98dfad" (UID: "a50257f6-6461-4ee3-b40b-4f56fe98dfad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.131064 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-config" (OuterVolumeSpecName: "config") pod "a50257f6-6461-4ee3-b40b-4f56fe98dfad" (UID: "a50257f6-6461-4ee3-b40b-4f56fe98dfad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.171971 4763 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.171998 4763 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a50257f6-6461-4ee3-b40b-4f56fe98dfad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.383533 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-nksxv"] Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.408329 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-nksxv"] Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.975214 4763 generic.go:334] "Generic (PLEG): container finished" podID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerID="9ce883adae2a2835b50f561d1c0e5d356ff155e2fd52c7fd1053b4070548f8e9" exitCode=0 Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.975567 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmnqb" event={"ID":"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70","Type":"ContainerDied","Data":"9ce883adae2a2835b50f561d1c0e5d356ff155e2fd52c7fd1053b4070548f8e9"} Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.985508 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.991763 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5qhw"] Dec 01 10:11:17 crc kubenswrapper[4763]: E1201 10:11:17.992146 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" containerName="init" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.992157 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" containerName="init" Dec 01 10:11:17 crc kubenswrapper[4763]: E1201 10:11:17.992187 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" containerName="dnsmasq-dns" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.992193 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" containerName="dnsmasq-dns" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.992385 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" containerName="dnsmasq-dns" Dec 01 10:11:17 crc kubenswrapper[4763]: I1201 10:11:17.998126 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.010921 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ee391497-0e27-412d-9209-a5c47226a435","Type":"ContainerStarted","Data":"a38d9a73157dfd349ab1871d5908adbd4ee5c88d4299d05c864baa9d8d7b8d2c"} Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.044800 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6730d033-63cf-46f2-b779-e751663b7735","Type":"ContainerStarted","Data":"1691f3bf4389466136c7290f47de763e5a6d56361acad74854b754cf89bae066"} Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.046115 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6730d033-63cf-46f2-b779-e751663b7735","Type":"ContainerStarted","Data":"25b882dffa8b3a73929b89293bb484e9f361cde5a6d24028c11d76c7eabbcb7b"} Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.050236 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5qhw"] Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.105759 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-catalog-content\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.105973 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-utilities\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.106014 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7nns\" (UniqueName: \"kubernetes.io/projected/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-kube-api-access-x7nns\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.207412 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-utilities\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.207472 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7nns\" (UniqueName: \"kubernetes.io/projected/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-kube-api-access-x7nns\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.207565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-catalog-content\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw" 
Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.208016 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-catalog-content\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw"
Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.208232 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-utilities\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw"
Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.238388 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7nns\" (UniqueName: \"kubernetes.io/projected/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-kube-api-access-x7nns\") pod \"redhat-operators-w5qhw\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " pod="openshift-marketplace/redhat-operators-w5qhw"
Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.341486 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5qhw"
Dec 01 10:11:18 crc kubenswrapper[4763]: I1201 10:11:18.974861 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5qhw"]
Dec 01 10:11:19 crc kubenswrapper[4763]: I1201 10:11:19.011089 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a50257f6-6461-4ee3-b40b-4f56fe98dfad" path="/var/lib/kubelet/pods/a50257f6-6461-4ee3-b40b-4f56fe98dfad/volumes"
Dec 01 10:11:19 crc kubenswrapper[4763]: I1201 10:11:19.087146 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5qhw" event={"ID":"9f5677a7-52b7-41f2-b50a-aabcca5dd27c","Type":"ContainerStarted","Data":"5fe832b3b6f95af257dca9a024ba5f1d8f482dbfc314bdc48521009a334ca569"}
Dec 01 10:11:19 crc kubenswrapper[4763]: I1201 10:11:19.128934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ee391497-0e27-412d-9209-a5c47226a435","Type":"ContainerStarted","Data":"8ffb02cab29352248be1ccce837f009618746c468d3723ded3f641a8568cb70d"}
Dec 01 10:11:19 crc kubenswrapper[4763]: I1201 10:11:19.142317 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6730d033-63cf-46f2-b779-e751663b7735","Type":"ContainerStarted","Data":"6a57ff71d4954e88f1b0770e4c71bfb447ba766e555e2a782e6ca2642f009966"}
Dec 01 10:11:19 crc kubenswrapper[4763]: I1201 10:11:19.142528 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0"
Dec 01 10:11:19 crc kubenswrapper[4763]: I1201 10:11:19.198874 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.25934364 podStartE2EDuration="15.188725507s" podCreationTimestamp="2025-12-01 10:11:04 +0000 UTC" firstStartedPulling="2025-12-01 10:11:06.285743671 +0000 UTC m=+3383.554392439" lastFinishedPulling="2025-12-01 10:11:16.215125538 +0000 UTC m=+3393.483774306" observedRunningTime="2025-12-01 10:11:19.184071969 +0000 UTC m=+3396.452720737" watchObservedRunningTime="2025-12-01 10:11:19.188725507 +0000 UTC m=+3396.457374275"
Dec 01 10:11:19 crc kubenswrapper[4763]: I1201 10:11:19.225845 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=8.225819843 podStartE2EDuration="8.225819843s" podCreationTimestamp="2025-12-01 10:11:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:11:19.214582875 +0000 UTC m=+3396.483231643" watchObservedRunningTime="2025-12-01 10:11:19.225819843 +0000 UTC m=+3396.494468611"
Dec 01 10:11:19 crc kubenswrapper[4763]: I1201 10:11:19.406527 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79cd4dd7b6-cwmzr" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused"
Dec 01 10:11:20 crc kubenswrapper[4763]: I1201 10:11:20.152490 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerID="5d9fb1b6275f2b6e09fec053aed2c81a419a0f6d21eb6f33af6388cde8d4c39f" exitCode=0
Dec 01 10:11:20 crc kubenswrapper[4763]: I1201 10:11:20.152591 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5qhw" event={"ID":"9f5677a7-52b7-41f2-b50a-aabcca5dd27c","Type":"ContainerDied","Data":"5d9fb1b6275f2b6e09fec053aed2c81a419a0f6d21eb6f33af6388cde8d4c39f"}
Dec 01 10:11:20 crc kubenswrapper[4763]: I1201 10:11:20.158118 4763 generic.go:334] "Generic (PLEG): container finished" podID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerID="607666506f61f2b8a617c14b988aa367118eaea23b4c9b9d5a8bc7a7acedc3a2" exitCode=0
Dec 01 10:11:20 crc kubenswrapper[4763]: I1201 10:11:20.159065 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmnqb" event={"ID":"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70","Type":"ContainerDied","Data":"607666506f61f2b8a617c14b988aa367118eaea23b4c9b9d5a8bc7a7acedc3a2"}
Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.801980 4763 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.888447 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t46cz\" (UniqueName: \"kubernetes.io/projected/75e5bac3-096d-42dd-b1f8-19c03774fb1c-kube-api-access-t46cz\") pod \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.888555 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-log-httpd\") pod \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.888625 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-config-data\") pod \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.888708 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-scripts\") pod \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.888806 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-combined-ca-bundle\") pod \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.888850 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-sg-core-conf-yaml\") pod \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.888882 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-run-httpd\") pod \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.888959 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-ceilometer-tls-certs\") pod \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\" (UID: \"75e5bac3-096d-42dd-b1f8-19c03774fb1c\") " Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.892359 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "75e5bac3-096d-42dd-b1f8-19c03774fb1c" (UID: "75e5bac3-096d-42dd-b1f8-19c03774fb1c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.892270 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "75e5bac3-096d-42dd-b1f8-19c03774fb1c" (UID: "75e5bac3-096d-42dd-b1f8-19c03774fb1c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.923340 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-scripts" (OuterVolumeSpecName: "scripts") pod "75e5bac3-096d-42dd-b1f8-19c03774fb1c" (UID: "75e5bac3-096d-42dd-b1f8-19c03774fb1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.923614 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e5bac3-096d-42dd-b1f8-19c03774fb1c-kube-api-access-t46cz" (OuterVolumeSpecName: "kube-api-access-t46cz") pod "75e5bac3-096d-42dd-b1f8-19c03774fb1c" (UID: "75e5bac3-096d-42dd-b1f8-19c03774fb1c"). InnerVolumeSpecName "kube-api-access-t46cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.961618 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "75e5bac3-096d-42dd-b1f8-19c03774fb1c" (UID: "75e5bac3-096d-42dd-b1f8-19c03774fb1c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.987241 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "75e5bac3-096d-42dd-b1f8-19c03774fb1c" (UID: "75e5bac3-096d-42dd-b1f8-19c03774fb1c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.991638 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t46cz\" (UniqueName: \"kubernetes.io/projected/75e5bac3-096d-42dd-b1f8-19c03774fb1c-kube-api-access-t46cz\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.991684 4763 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.991697 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.991709 4763 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.991722 4763 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75e5bac3-096d-42dd-b1f8-19c03774fb1c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:21 crc kubenswrapper[4763]: I1201 10:11:21.991733 4763 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.001793 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75e5bac3-096d-42dd-b1f8-19c03774fb1c" (UID: "75e5bac3-096d-42dd-b1f8-19c03774fb1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.034642 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-config-data" (OuterVolumeSpecName: "config-data") pod "75e5bac3-096d-42dd-b1f8-19c03774fb1c" (UID: "75e5bac3-096d-42dd-b1f8-19c03774fb1c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.095242 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.095274 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75e5bac3-096d-42dd-b1f8-19c03774fb1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.188249 4763 generic.go:334] "Generic (PLEG): container finished" podID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerID="80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2" exitCode=0 Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.188364 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerDied","Data":"80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2"} Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.188421 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75e5bac3-096d-42dd-b1f8-19c03774fb1c","Type":"ContainerDied","Data":"31c14605f1ffe1886f54bf07784492cdcd4de94049cc402a9d1401b41b276ae0"} Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.188499 4763 scope.go:117] "RemoveContainer" containerID="31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.188803 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.194065 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmnqb" event={"ID":"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70","Type":"ContainerStarted","Data":"0dc9c4197bff4aeed48696cdc704350957c2e21aec7155fcabc283de6d5f6e90"} Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.229752 4763 scope.go:117] "RemoveContainer" containerID="1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.236472 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gmnqb" podStartSLOduration=4.997033076 podStartE2EDuration="8.236439276s" podCreationTimestamp="2025-12-01 10:11:14 +0000 UTC" firstStartedPulling="2025-12-01 10:11:17.984674659 +0000 UTC m=+3395.253323427" lastFinishedPulling="2025-12-01 10:11:21.224080859 +0000 UTC m=+3398.492729627" observedRunningTime="2025-12-01 10:11:22.217140887 +0000 UTC m=+3399.485789665" watchObservedRunningTime="2025-12-01 10:11:22.236439276 +0000 UTC m=+3399.505088044" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.250592 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.265686 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.294486 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:11:22 crc kubenswrapper[4763]: E1201 10:11:22.294914 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" 
containerName="ceilometer-central-agent" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.294928 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="ceilometer-central-agent" Dec 01 10:11:22 crc kubenswrapper[4763]: E1201 10:11:22.294948 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="proxy-httpd" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.294954 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="proxy-httpd" Dec 01 10:11:22 crc kubenswrapper[4763]: E1201 10:11:22.294967 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="sg-core" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.294973 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="sg-core" Dec 01 10:11:22 crc kubenswrapper[4763]: E1201 10:11:22.294984 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="ceilometer-notification-agent" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.294990 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="ceilometer-notification-agent" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.295156 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="ceilometer-central-agent" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.295167 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="proxy-httpd" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.295178 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="sg-core" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.295196 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" containerName="ceilometer-notification-agent" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.297609 4763 util.go:30] "No sandbox for pod can be found. 
Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.300759 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.301071 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.301186 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.331536 4763 scope.go:117] "RemoveContainer" containerID="80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.353558 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.389328 4763 scope.go:117] "RemoveContainer" containerID="1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.402397 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.402786 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.402929 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcb70e3b-87a0-49ff-8946-e182808bf846-log-httpd\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.403034 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.403169 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcb70e3b-87a0-49ff-8946-e182808bf846-run-httpd\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.403257 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4krs\" (UniqueName: \"kubernetes.io/projected/fcb70e3b-87a0-49ff-8946-e182808bf846-kube-api-access-s4krs\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.403350 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-scripts\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.403471 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-config-data\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.415146 4763 scope.go:117] "RemoveContainer" containerID="31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0" Dec 01 10:11:22 crc kubenswrapper[4763]: E1201 10:11:22.415712 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0\": container with ID starting with 31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0 not found: ID does not exist" containerID="31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.415805 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0"} err="failed to get container status \"31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0\": rpc error: code = NotFound desc = could not find container \"31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0\": container with ID starting with 31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0 not found: ID does not exist" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.415884 4763 scope.go:117] "RemoveContainer" containerID="1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e" Dec 01 10:11:22 crc kubenswrapper[4763]: E1201 10:11:22.416318 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e\": container with ID starting with 1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e not found: ID does not exist" containerID="1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.416542 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e"} err="failed to get container status \"1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e\": rpc error: code = NotFound desc = could not find container \"1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e\": container with ID starting with 1c280f764e47f0c7ec352f2cce180b26d14f5f3177ce8f04018e330b51e04e1e not found: ID does not exist" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.416601 4763 scope.go:117] "RemoveContainer" containerID="80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2" Dec 01 10:11:22 crc kubenswrapper[4763]: E1201 10:11:22.416930 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2\": container with ID starting with 80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2 not 
found: ID does not exist" containerID="80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.416968 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2"} err="failed to get container status \"80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2\": rpc error: code = NotFound desc = could not find container \"80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2\": container with ID starting with 80999c29f6eac927a1ee800fa1178dd629fcd2ee0120aed511188d7bb49abfe2 not found: ID does not exist" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.416986 4763 scope.go:117] "RemoveContainer" containerID="1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf" Dec 01 10:11:22 crc kubenswrapper[4763]: E1201 10:11:22.417655 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf\": container with ID starting with 1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf not found: ID does not exist" containerID="1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.417678 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf"} err="failed to get container status \"1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf\": rpc error: code = NotFound desc = could not find container \"1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf\": container with ID starting with 1e15065ce24c794d135f880dcd35e6556f82e6c4bbdbe11fc34ef8f7224ee6bf not found: ID does not exist" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.506767 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcb70e3b-87a0-49ff-8946-e182808bf846-log-httpd\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.507262 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.507360 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcb70e3b-87a0-49ff-8946-e182808bf846-log-httpd\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.507654 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcb70e3b-87a0-49ff-8946-e182808bf846-run-httpd\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.507740 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4krs\" (UniqueName: \"kubernetes.io/projected/fcb70e3b-87a0-49ff-8946-e182808bf846-kube-api-access-s4krs\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0"
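
The "ContainerStatus from runtime service failed" / "DeleteContainer returned error" pairs above are benign: a second RemoveContainer attempt raced with a deletion that had already completed, and CRI-O answers with gRPC NotFound. A deletion like this is typically made idempotent by treating NotFound as success, roughly as below (a sketch, not the kubelet's actual code; removeOnce stands in for the real CRI client call):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer treats gRPC NotFound as success, so a retried delete
    // converges instead of failing.
    func removeContainer(id string, removeOnce func(string) error) error {
        if err := removeOnce(id); err != nil {
            if status.Code(err) == codes.NotFound {
                fmt.Printf("container %s already gone, nothing to do\n", id[:12])
                return nil
            }
            return fmt.Errorf("RemoveContainer %s: %w", id[:12], err)
        }
        return nil
    }

    func main() {
        // Simulate a runtime that has already deleted the container.
        gone := func(id string) error {
            return status.Errorf(codes.NotFound, "could not find container %q", id)
        }
        _ = removeContainer("31246d3aaf175f3e7c6f97d363f256f50ccf73233526d315c811d69a47d49fc0", gone)
    }
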
\"kubernetes.io/projected/fcb70e3b-87a0-49ff-8946-e182808bf846-kube-api-access-s4krs\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.507829 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-scripts\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.507921 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-config-data\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.508807 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.508943 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.507966 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fcb70e3b-87a0-49ff-8946-e182808bf846-run-httpd\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.534303 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.534888 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.535218 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-config-data\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.539886 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-scripts\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.540898 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fcb70e3b-87a0-49ff-8946-e182808bf846-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.550874 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4krs\" (UniqueName: \"kubernetes.io/projected/fcb70e3b-87a0-49ff-8946-e182808bf846-kube-api-access-s4krs\") pod \"ceilometer-0\" (UID: \"fcb70e3b-87a0-49ff-8946-e182808bf846\") " pod="openstack/ceilometer-0" Dec 01 10:11:22 crc kubenswrapper[4763]: I1201 10:11:22.622707 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 01 10:11:23 crc kubenswrapper[4763]: I1201 10:11:23.017693 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e5bac3-096d-42dd-b1f8-19c03774fb1c" path="/var/lib/kubelet/pods/75e5bac3-096d-42dd-b1f8-19c03774fb1c/volumes" Dec 01 10:11:23 crc kubenswrapper[4763]: W1201 10:11:23.219652 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcb70e3b_87a0_49ff_8946_e182808bf846.slice/crio-922cf1756f6a0a96a4daad3534f15e1ecb4e18599623e2ce53e0826569693ffa WatchSource:0}: Error finding container 922cf1756f6a0a96a4daad3534f15e1ecb4e18599623e2ce53e0826569693ffa: Status 404 returned error can't find the container with id 922cf1756f6a0a96a4daad3534f15e1ecb4e18599623e2ce53e0826569693ffa Dec 01 10:11:23 crc kubenswrapper[4763]: I1201 10:11:23.225582 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 01 10:11:24 crc kubenswrapper[4763]: I1201 10:11:24.229753 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcb70e3b-87a0-49ff-8946-e182808bf846","Type":"ContainerStarted","Data":"a86e0e8500a01640a6db8d4aee4bd70a92d1a15f1c3ca65cf1cf0afaa2f29728"} Dec 01 10:11:24 crc kubenswrapper[4763]: I1201 10:11:24.230016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcb70e3b-87a0-49ff-8946-e182808bf846","Type":"ContainerStarted","Data":"922cf1756f6a0a96a4daad3534f15e1ecb4e18599623e2ce53e0826569693ffa"} Dec 01 10:11:25 crc kubenswrapper[4763]: I1201 10:11:25.244700 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcb70e3b-87a0-49ff-8946-e182808bf846","Type":"ContainerStarted","Data":"d3709638477d1c2c370b19d0b67a32468c7b5b08c0a052dea7e80b7b8ae0c9e5"} Dec 01 10:11:25 crc kubenswrapper[4763]: I1201 10:11:25.269086 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 01 10:11:25 crc kubenswrapper[4763]: I1201 10:11:25.282230 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:25 crc kubenswrapper[4763]: I1201 10:11:25.282667 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:25 crc kubenswrapper[4763]: I1201 10:11:25.341950 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:26 crc kubenswrapper[4763]: I1201 10:11:26.257607 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fcb70e3b-87a0-49ff-8946-e182808bf846","Type":"ContainerStarted","Data":"98d7220860e23e4fe11da8528f2c5db9636d253c19d20f40b575c8962c692504"} Dec 01 10:11:26 crc kubenswrapper[4763]: I1201 10:11:26.326892 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:26 crc kubenswrapper[4763]: I1201 10:11:26.743860 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmnqb"] Dec 01 10:11:26 crc kubenswrapper[4763]: I1201 10:11:26.995615 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:11:26 crc kubenswrapper[4763]: E1201 10:11:26.996215 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:11:27 crc kubenswrapper[4763]: I1201 10:11:27.475182 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 01 10:11:27 crc kubenswrapper[4763]: I1201 10:11:27.573231 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 10:11:28 crc kubenswrapper[4763]: I1201 10:11:28.286345 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerName="manila-scheduler" containerID="cri-o://eb4643355b67ca85bf7fd1bc1bb6fe692b4f388fbc00b0a65593b3c080768642" gracePeriod=30 Dec 01 10:11:28 crc kubenswrapper[4763]: I1201 10:11:28.287762 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fcb70e3b-87a0-49ff-8946-e182808bf846","Type":"ContainerStarted","Data":"b21452fbf9f09ae62e5d9290d57e9a391fd9140529d6cb1cc7e12210aa5a5fb1"} Dec 01 10:11:28 crc kubenswrapper[4763]: I1201 10:11:28.287802 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 01 10:11:28 crc kubenswrapper[4763]: I1201 10:11:28.287950 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gmnqb" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerName="registry-server" containerID="cri-o://0dc9c4197bff4aeed48696cdc704350957c2e21aec7155fcabc283de6d5f6e90" gracePeriod=2 Dec 01 10:11:28 crc kubenswrapper[4763]: I1201 10:11:28.288267 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerName="probe" containerID="cri-o://a811288e2224e5ba2242375d747591a0903f1c95e51f6cd267a9b8619070e020" gracePeriod=30 Dec 01 10:11:29 crc kubenswrapper[4763]: I1201 10:11:29.299399 4763 generic.go:334] "Generic (PLEG): container finished" podID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerID="a811288e2224e5ba2242375d747591a0903f1c95e51f6cd267a9b8619070e020" exitCode=0 Dec 01 10:11:29 crc kubenswrapper[4763]: I1201 10:11:29.299481 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"796d5724-e2a3-4fb6-9346-5a93b34e385a","Type":"ContainerDied","Data":"a811288e2224e5ba2242375d747591a0903f1c95e51f6cd267a9b8619070e020"} Dec 01 10:11:29 crc kubenswrapper[4763]: I1201 10:11:29.307317 4763 generic.go:334] "Generic (PLEG): container finished" podID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerID="0dc9c4197bff4aeed48696cdc704350957c2e21aec7155fcabc283de6d5f6e90" exitCode=0 Dec 01 10:11:29 crc kubenswrapper[4763]: I1201 10:11:29.307404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmnqb" event={"ID":"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70","Type":"ContainerDied","Data":"0dc9c4197bff4aeed48696cdc704350957c2e21aec7155fcabc283de6d5f6e90"} Dec 01 10:11:29 crc kubenswrapper[4763]: I1201 10:11:29.406593 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79cd4dd7b6-cwmzr" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Dec 01 10:11:29 crc kubenswrapper[4763]: I1201 10:11:29.406693 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:11:29 crc kubenswrapper[4763]: I1201 10:11:29.443900 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.323856572 podStartE2EDuration="7.443875669s" podCreationTimestamp="2025-12-01 10:11:22 +0000 UTC" firstStartedPulling="2025-12-01 10:11:23.224601398 +0000 UTC m=+3400.493250166" lastFinishedPulling="2025-12-01 10:11:27.344620495 +0000 UTC m=+3404.613269263" observedRunningTime="2025-12-01 10:11:28.319254687 +0000 UTC m=+3405.587903455" watchObservedRunningTime="2025-12-01 10:11:29.443875669 +0000 UTC m=+3406.712524437" Dec 01 10:11:31 crc kubenswrapper[4763]: I1201 10:11:31.330617 4763 generic.go:334] "Generic (PLEG): container finished" podID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerID="eb4643355b67ca85bf7fd1bc1bb6fe692b4f388fbc00b0a65593b3c080768642" exitCode=0 Dec 01 10:11:31 crc kubenswrapper[4763]: I1201 10:11:31.330664 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"796d5724-e2a3-4fb6-9346-5a93b34e385a","Type":"ContainerDied","Data":"eb4643355b67ca85bf7fd1bc1bb6fe692b4f388fbc00b0a65593b3c080768642"} Dec 01 10:11:33 crc kubenswrapper[4763]: I1201 10:11:33.843837 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:33 crc kubenswrapper[4763]: I1201 10:11:33.863605 4763 util.go:48] "No ready sandbox for pod can be found. 
Dec 01 10:11:33 crc kubenswrapper[4763]: I1201 10:11:33.962400 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6598\" (UniqueName: \"kubernetes.io/projected/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-kube-api-access-z6598\") pod \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " Dec 01 10:11:33 crc kubenswrapper[4763]: I1201 10:11:33.962722 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-catalog-content\") pod \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " Dec 01 10:11:33 crc kubenswrapper[4763]: I1201 10:11:33.962852 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-utilities\") pod \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\" (UID: \"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70\") " Dec 01 10:11:33 crc kubenswrapper[4763]: I1201 10:11:33.966586 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-utilities" (OuterVolumeSpecName: "utilities") pod "1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" (UID: "1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.000692 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" (UID: "1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.043367 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0343e2_73aa_4c17_8f7c_9835bdba9977.slice/crio-conmon-23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0343e2_73aa_4c17_8f7c_9835bdba9977.slice/crio-23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.065051 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-scripts\") pod \"796d5724-e2a3-4fb6-9346-5a93b34e385a\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.065086 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data\") pod \"796d5724-e2a3-4fb6-9346-5a93b34e385a\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.065193 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mmc8\" (UniqueName: \"kubernetes.io/projected/796d5724-e2a3-4fb6-9346-5a93b34e385a-kube-api-access-8mmc8\") pod \"796d5724-e2a3-4fb6-9346-5a93b34e385a\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.065335 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/796d5724-e2a3-4fb6-9346-5a93b34e385a-etc-machine-id\") pod \"796d5724-e2a3-4fb6-9346-5a93b34e385a\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.065377 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data-custom\") pod \"796d5724-e2a3-4fb6-9346-5a93b34e385a\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.065634 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-combined-ca-bundle\") pod \"796d5724-e2a3-4fb6-9346-5a93b34e385a\" (UID: \"796d5724-e2a3-4fb6-9346-5a93b34e385a\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.066050 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.066061 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.077127 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/796d5724-e2a3-4fb6-9346-5a93b34e385a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "796d5724-e2a3-4fb6-9346-5a93b34e385a" (UID: "796d5724-e2a3-4fb6-9346-5a93b34e385a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.167861 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/796d5724-e2a3-4fb6-9346-5a93b34e385a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.198703 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.288124 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/796d5724-e2a3-4fb6-9346-5a93b34e385a-kube-api-access-8mmc8" (OuterVolumeSpecName: "kube-api-access-8mmc8") pod "796d5724-e2a3-4fb6-9346-5a93b34e385a" (UID: "796d5724-e2a3-4fb6-9346-5a93b34e385a"). InnerVolumeSpecName "kube-api-access-8mmc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.288773 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-kube-api-access-z6598" (OuterVolumeSpecName: "kube-api-access-z6598") pod "1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" (UID: "1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70"). InnerVolumeSpecName "kube-api-access-z6598". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.291468 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "796d5724-e2a3-4fb6-9346-5a93b34e385a" (UID: "796d5724-e2a3-4fb6-9346-5a93b34e385a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.297593 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-scripts" (OuterVolumeSpecName: "scripts") pod "796d5724-e2a3-4fb6-9346-5a93b34e385a" (UID: "796d5724-e2a3-4fb6-9346-5a93b34e385a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.342771 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "796d5724-e2a3-4fb6-9346-5a93b34e385a" (UID: "796d5724-e2a3-4fb6-9346-5a93b34e385a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.371049 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0343e2-73aa-4c17-8f7c-9835bdba9977-logs\") pod \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.371159 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-combined-ca-bundle\") pod \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.371185 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcjb8\" (UniqueName: \"kubernetes.io/projected/aa0343e2-73aa-4c17-8f7c-9835bdba9977-kube-api-access-fcjb8\") pod \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.371417 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-secret-key\") pod \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.371486 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-config-data\") pod \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.371521 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-scripts\") pod \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.371571 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-tls-certs\") pod \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\" (UID: \"aa0343e2-73aa-4c17-8f7c-9835bdba9977\") " Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.371987 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.372003 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.372016 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mmc8\" (UniqueName: \"kubernetes.io/projected/796d5724-e2a3-4fb6-9346-5a93b34e385a-kube-api-access-8mmc8\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.372029 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.372042 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6598\" (UniqueName: \"kubernetes.io/projected/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70-kube-api-access-z6598\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.384351 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa0343e2-73aa-4c17-8f7c-9835bdba9977-logs" (OuterVolumeSpecName: "logs") pod "aa0343e2-73aa-4c17-8f7c-9835bdba9977" (UID: "aa0343e2-73aa-4c17-8f7c-9835bdba9977"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.386304 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"796d5724-e2a3-4fb6-9346-5a93b34e385a","Type":"ContainerDied","Data":"498fbdb6d35528ae6b8bc487b39ae21a9bf3ba598369e5fb9ffb25c1834a35a3"} Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.386349 4763 scope.go:117] "RemoveContainer" containerID="a811288e2224e5ba2242375d747591a0903f1c95e51f6cd267a9b8619070e020" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.386345 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.388705 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0343e2-73aa-4c17-8f7c-9835bdba9977-kube-api-access-fcjb8" (OuterVolumeSpecName: "kube-api-access-fcjb8") pod "aa0343e2-73aa-4c17-8f7c-9835bdba9977" (UID: "aa0343e2-73aa-4c17-8f7c-9835bdba9977"). InnerVolumeSpecName "kube-api-access-fcjb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.388872 4763 generic.go:334] "Generic (PLEG): container finished" podID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerID="23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a" exitCode=137 Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.388948 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79cd4dd7b6-cwmzr" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.388895 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79cd4dd7b6-cwmzr" event={"ID":"aa0343e2-73aa-4c17-8f7c-9835bdba9977","Type":"ContainerDied","Data":"23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a"} Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.388998 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79cd4dd7b6-cwmzr" event={"ID":"aa0343e2-73aa-4c17-8f7c-9835bdba9977","Type":"ContainerDied","Data":"6e3c423f62123e022affe69df503408f26c279989857772a1498204850e84433"} Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.391239 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "aa0343e2-73aa-4c17-8f7c-9835bdba9977" (UID: "aa0343e2-73aa-4c17-8f7c-9835bdba9977"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.398904 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5qhw" event={"ID":"9f5677a7-52b7-41f2-b50a-aabcca5dd27c","Type":"ContainerStarted","Data":"66d19425da46fc2df5d3e310fcc0e24a5b27140f10cc90f9f949424361d4f28b"} Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.405169 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data" (OuterVolumeSpecName: "config-data") pod "796d5724-e2a3-4fb6-9346-5a93b34e385a" (UID: "796d5724-e2a3-4fb6-9346-5a93b34e385a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.408260 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gmnqb" event={"ID":"1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70","Type":"ContainerDied","Data":"f99364a6fdc84dc17efcba3a50336142aaa5ef2b2c033d070bf08dbe7c8b1645"} Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.408476 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gmnqb" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.417869 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-config-data" (OuterVolumeSpecName: "config-data") pod "aa0343e2-73aa-4c17-8f7c-9835bdba9977" (UID: "aa0343e2-73aa-4c17-8f7c-9835bdba9977"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.445842 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa0343e2-73aa-4c17-8f7c-9835bdba9977" (UID: "aa0343e2-73aa-4c17-8f7c-9835bdba9977"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.459859 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "aa0343e2-73aa-4c17-8f7c-9835bdba9977" (UID: "aa0343e2-73aa-4c17-8f7c-9835bdba9977"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.471292 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-scripts" (OuterVolumeSpecName: "scripts") pod "aa0343e2-73aa-4c17-8f7c-9835bdba9977" (UID: "aa0343e2-73aa-4c17-8f7c-9835bdba9977"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.474231 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796d5724-e2a3-4fb6-9346-5a93b34e385a-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.474292 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.474307 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcjb8\" (UniqueName: \"kubernetes.io/projected/aa0343e2-73aa-4c17-8f7c-9835bdba9977-kube-api-access-fcjb8\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.474317 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.474326 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.474357 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa0343e2-73aa-4c17-8f7c-9835bdba9977-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.474371 4763 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa0343e2-73aa-4c17-8f7c-9835bdba9977-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.474382 4763 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa0343e2-73aa-4c17-8f7c-9835bdba9977-logs\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.533006 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmnqb"] Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.542817 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gmnqb"] Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.576109 4763 scope.go:117] "RemoveContainer" containerID="eb4643355b67ca85bf7fd1bc1bb6fe692b4f388fbc00b0a65593b3c080768642" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.612231 4763 scope.go:117] "RemoveContainer" containerID="133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.881489 4763 scope.go:117] "RemoveContainer" containerID="23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.952828 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.961980 4763 scope.go:117] "RemoveContainer" containerID="133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35" Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.962594 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35\": container with ID starting with 133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35 not found: ID does not exist" containerID="133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.962664 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35"} err="failed to get container status \"133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35\": rpc error: code = NotFound desc = could not find container \"133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35\": container with ID starting with 133b1815b5c6244c5d411d512d59f493d5f3acd42c81ff2ea93d4bc5c02d6d35 not found: ID does not exist" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.962689 4763 scope.go:117] "RemoveContainer" containerID="23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.962894 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.963405 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a\": container with ID starting with 23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a not found: ID does not exist" containerID="23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.963448 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a"} err="failed to get container status \"23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a\": rpc error: code = NotFound desc = could not find container \"23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a\": container with ID starting with 23d2607e9ee8fc1a6cdf8cd6175a6b13e5f53565819ae4f75bd632b5ebba293a not found: ID does not exist" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.963498 4763 scope.go:117] "RemoveContainer" containerID="0dc9c4197bff4aeed48696cdc704350957c2e21aec7155fcabc283de6d5f6e90" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.970612 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.971554 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerName="extract-utilities" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971571 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerName="extract-utilities" Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.971582 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerName="extract-content" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971588 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerName="extract-content" Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.971601 4763 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971607 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon" Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.971630 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon-log" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971637 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon-log" Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.971650 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerName="registry-server" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971657 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerName="registry-server" Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.971667 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerName="probe" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971673 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerName="probe" Dec 01 10:11:34 crc kubenswrapper[4763]: E1201 10:11:34.971683 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerName="manila-scheduler" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971690 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerName="manila-scheduler" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971892 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerName="probe" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971926 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971936 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" containerName="registry-server" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971947 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" containerName="manila-scheduler" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.971955 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" containerName="horizon-log" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.973125 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.975419 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 01 10:11:34 crc kubenswrapper[4763]: I1201 10:11:34.981000 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79cd4dd7b6-cwmzr"] Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.013487 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70" path="/var/lib/kubelet/pods/1d904c0f-8eb4-4cb4-8904-18ceb0ab8e70/volumes" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.014206 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="796d5724-e2a3-4fb6-9346-5a93b34e385a" path="/var/lib/kubelet/pods/796d5724-e2a3-4fb6-9346-5a93b34e385a/volumes" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.014966 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79cd4dd7b6-cwmzr"] Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.014996 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.089377 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.089720 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-config-data\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.089857 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.089879 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-scripts\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.089955 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.090766 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj888\" (UniqueName: \"kubernetes.io/projected/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-kube-api-access-nj888\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc 
kubenswrapper[4763]: I1201 10:11:35.149290 4763 scope.go:117] "RemoveContainer" containerID="607666506f61f2b8a617c14b988aa367118eaea23b4c9b9d5a8bc7a7acedc3a2" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.169374 4763 scope.go:117] "RemoveContainer" containerID="9ce883adae2a2835b50f561d1c0e5d356ff155e2fd52c7fd1053b4070548f8e9" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.192941 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-config-data\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.193041 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.193066 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-scripts\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.193083 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.193122 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj888\" (UniqueName: \"kubernetes.io/projected/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-kube-api-access-nj888\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.193137 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.193218 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.199909 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-config-data\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.200396 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-scripts\") pod \"manila-scheduler-0\" (UID: 
\"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.201603 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.206815 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.210415 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj888\" (UniqueName: \"kubernetes.io/projected/3cf5f5cb-a8a0-40f4-8acc-1f41415052d2-kube-api-access-nj888\") pod \"manila-scheduler-0\" (UID: \"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2\") " pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.303971 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.707012 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 01 10:11:35 crc kubenswrapper[4763]: I1201 10:11:35.847996 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 01 10:11:36 crc kubenswrapper[4763]: I1201 10:11:36.429423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2","Type":"ContainerStarted","Data":"6667effd484c99fc40fb08e0327b9d0abc02850f29d7a2583914c69e44ef7cfb"} Dec 01 10:11:36 crc kubenswrapper[4763]: I1201 10:11:36.432866 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerID="66d19425da46fc2df5d3e310fcc0e24a5b27140f10cc90f9f949424361d4f28b" exitCode=0 Dec 01 10:11:36 crc kubenswrapper[4763]: I1201 10:11:36.432919 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5qhw" event={"ID":"9f5677a7-52b7-41f2-b50a-aabcca5dd27c","Type":"ContainerDied","Data":"66d19425da46fc2df5d3e310fcc0e24a5b27140f10cc90f9f949424361d4f28b"} Dec 01 10:11:37 crc kubenswrapper[4763]: I1201 10:11:37.007305 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0343e2-73aa-4c17-8f7c-9835bdba9977" path="/var/lib/kubelet/pods/aa0343e2-73aa-4c17-8f7c-9835bdba9977/volumes" Dec 01 10:11:37 crc kubenswrapper[4763]: I1201 10:11:37.402375 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 01 10:11:37 crc kubenswrapper[4763]: I1201 10:11:37.450892 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 10:11:37 crc kubenswrapper[4763]: I1201 10:11:37.451150 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="ee391497-0e27-412d-9209-a5c47226a435" containerName="manila-share" containerID="cri-o://a38d9a73157dfd349ab1871d5908adbd4ee5c88d4299d05c864baa9d8d7b8d2c" gracePeriod=30 Dec 01 10:11:37 crc kubenswrapper[4763]: I1201 
10:11:37.451613 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="ee391497-0e27-412d-9209-a5c47226a435" containerName="probe" containerID="cri-o://8ffb02cab29352248be1ccce837f009618746c468d3723ded3f641a8568cb70d" gracePeriod=30 Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.461304 4763 generic.go:334] "Generic (PLEG): container finished" podID="ee391497-0e27-412d-9209-a5c47226a435" containerID="8ffb02cab29352248be1ccce837f009618746c468d3723ded3f641a8568cb70d" exitCode=0 Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.462037 4763 generic.go:334] "Generic (PLEG): container finished" podID="ee391497-0e27-412d-9209-a5c47226a435" containerID="a38d9a73157dfd349ab1871d5908adbd4ee5c88d4299d05c864baa9d8d7b8d2c" exitCode=1 Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.461404 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ee391497-0e27-412d-9209-a5c47226a435","Type":"ContainerDied","Data":"8ffb02cab29352248be1ccce837f009618746c468d3723ded3f641a8568cb70d"} Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.462104 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ee391497-0e27-412d-9209-a5c47226a435","Type":"ContainerDied","Data":"a38d9a73157dfd349ab1871d5908adbd4ee5c88d4299d05c864baa9d8d7b8d2c"} Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.464078 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2","Type":"ContainerStarted","Data":"c88499325a00ae8d2b486afafa37c0875da0e7e80dc3d0eaa79d9753284fff3b"} Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.466356 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5qhw" event={"ID":"9f5677a7-52b7-41f2-b50a-aabcca5dd27c","Type":"ContainerStarted","Data":"f7d0659b8bf94921bbf5a75b0d4b553d21ed98f63fce659a55bb72aee3c36b52"} Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.499533 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5qhw" podStartSLOduration=3.879889809 podStartE2EDuration="21.499507717s" podCreationTimestamp="2025-12-01 10:11:17 +0000 UTC" firstStartedPulling="2025-12-01 10:11:20.154083144 +0000 UTC m=+3397.422731912" lastFinishedPulling="2025-12-01 10:11:37.773701052 +0000 UTC m=+3415.042349820" observedRunningTime="2025-12-01 10:11:38.488372292 +0000 UTC m=+3415.757021060" watchObservedRunningTime="2025-12-01 10:11:38.499507717 +0000 UTC m=+3415.768156505" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.677916 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.762385 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-combined-ca-bundle\") pod \"ee391497-0e27-412d-9209-a5c47226a435\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.762650 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt26k\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-kube-api-access-tt26k\") pod \"ee391497-0e27-412d-9209-a5c47226a435\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.762734 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data\") pod \"ee391497-0e27-412d-9209-a5c47226a435\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.762843 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-var-lib-manila\") pod \"ee391497-0e27-412d-9209-a5c47226a435\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.762921 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data-custom\") pod \"ee391497-0e27-412d-9209-a5c47226a435\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.762997 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-ceph\") pod \"ee391497-0e27-412d-9209-a5c47226a435\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.763132 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-etc-machine-id\") pod \"ee391497-0e27-412d-9209-a5c47226a435\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.763212 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-scripts\") pod \"ee391497-0e27-412d-9209-a5c47226a435\" (UID: \"ee391497-0e27-412d-9209-a5c47226a435\") " Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.763382 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "ee391497-0e27-412d-9209-a5c47226a435" (UID: "ee391497-0e27-412d-9209-a5c47226a435"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.763750 4763 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-var-lib-manila\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.772566 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-scripts" (OuterVolumeSpecName: "scripts") pod "ee391497-0e27-412d-9209-a5c47226a435" (UID: "ee391497-0e27-412d-9209-a5c47226a435"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.785729 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ee391497-0e27-412d-9209-a5c47226a435" (UID: "ee391497-0e27-412d-9209-a5c47226a435"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.786276 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-kube-api-access-tt26k" (OuterVolumeSpecName: "kube-api-access-tt26k") pod "ee391497-0e27-412d-9209-a5c47226a435" (UID: "ee391497-0e27-412d-9209-a5c47226a435"). InnerVolumeSpecName "kube-api-access-tt26k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.802411 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-ceph" (OuterVolumeSpecName: "ceph") pod "ee391497-0e27-412d-9209-a5c47226a435" (UID: "ee391497-0e27-412d-9209-a5c47226a435"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.807795 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee391497-0e27-412d-9209-a5c47226a435" (UID: "ee391497-0e27-412d-9209-a5c47226a435"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.866567 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt26k\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-kube-api-access-tt26k\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.866612 4763 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.866621 4763 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ee391497-0e27-412d-9209-a5c47226a435-ceph\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.866630 4763 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ee391497-0e27-412d-9209-a5c47226a435-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.866638 4763 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-scripts\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.868809 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee391497-0e27-412d-9209-a5c47226a435" (UID: "ee391497-0e27-412d-9209-a5c47226a435"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.939504 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data" (OuterVolumeSpecName: "config-data") pod "ee391497-0e27-412d-9209-a5c47226a435" (UID: "ee391497-0e27-412d-9209-a5c47226a435"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.967816 4763 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:38 crc kubenswrapper[4763]: I1201 10:11:38.967843 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee391497-0e27-412d-9209-a5c47226a435-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.478942 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"3cf5f5cb-a8a0-40f4-8acc-1f41415052d2","Type":"ContainerStarted","Data":"790604523a1c4242da156903eebbeaef02e5ebecac8a8e3a54f1fc58da524422"} Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.482212 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ee391497-0e27-412d-9209-a5c47226a435","Type":"ContainerDied","Data":"917d60a72c1990814f2d1a0b6bf444c5e2fa5d61c3a7ec2c7794316eec28c367"} Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.482250 4763 scope.go:117] "RemoveContainer" containerID="8ffb02cab29352248be1ccce837f009618746c468d3723ded3f641a8568cb70d" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.482288 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.505708 4763 scope.go:117] "RemoveContainer" containerID="a38d9a73157dfd349ab1871d5908adbd4ee5c88d4299d05c864baa9d8d7b8d2c" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.516648 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=5.516624203 podStartE2EDuration="5.516624203s" podCreationTimestamp="2025-12-01 10:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:11:39.506144016 +0000 UTC m=+3416.774792794" watchObservedRunningTime="2025-12-01 10:11:39.516624203 +0000 UTC m=+3416.785272971" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.546303 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.562013 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.576616 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 10:11:39 crc kubenswrapper[4763]: E1201 10:11:39.577243 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee391497-0e27-412d-9209-a5c47226a435" containerName="manila-share" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.577307 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee391497-0e27-412d-9209-a5c47226a435" containerName="manila-share" Dec 01 10:11:39 crc kubenswrapper[4763]: E1201 10:11:39.577414 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee391497-0e27-412d-9209-a5c47226a435" containerName="probe" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.577483 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee391497-0e27-412d-9209-a5c47226a435" containerName="probe" Dec 01 10:11:39 crc 
kubenswrapper[4763]: I1201 10:11:39.577737 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee391497-0e27-412d-9209-a5c47226a435" containerName="manila-share" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.577822 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee391497-0e27-412d-9209-a5c47226a435" containerName="probe" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.579068 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.581094 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.604179 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.680191 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18c3fc4b-5681-40d1-8b65-e85af0a1905e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.680265 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.680292 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-config-data\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.680347 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/18c3fc4b-5681-40d1-8b65-e85af0a1905e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.680365 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.680432 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p546s\" (UniqueName: \"kubernetes.io/projected/18c3fc4b-5681-40d1-8b65-e85af0a1905e-kube-api-access-p546s\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.680469 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-scripts\") pod \"manila-share-share1-0\" 
(UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.680493 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/18c3fc4b-5681-40d1-8b65-e85af0a1905e-ceph\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.782727 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.782877 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p546s\" (UniqueName: \"kubernetes.io/projected/18c3fc4b-5681-40d1-8b65-e85af0a1905e-kube-api-access-p546s\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.782913 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-scripts\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.782941 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/18c3fc4b-5681-40d1-8b65-e85af0a1905e-ceph\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.783038 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18c3fc4b-5681-40d1-8b65-e85af0a1905e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.783081 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.783104 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-config-data\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.783143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/18c3fc4b-5681-40d1-8b65-e85af0a1905e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.783271 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/18c3fc4b-5681-40d1-8b65-e85af0a1905e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.783848 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18c3fc4b-5681-40d1-8b65-e85af0a1905e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.788108 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.788187 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/18c3fc4b-5681-40d1-8b65-e85af0a1905e-ceph\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.789073 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.794366 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-config-data\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.805900 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p546s\" (UniqueName: \"kubernetes.io/projected/18c3fc4b-5681-40d1-8b65-e85af0a1905e-kube-api-access-p546s\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.805918 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c3fc4b-5681-40d1-8b65-e85af0a1905e-scripts\") pod \"manila-share-share1-0\" (UID: \"18c3fc4b-5681-40d1-8b65-e85af0a1905e\") " pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.905727 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 01 10:11:39 crc kubenswrapper[4763]: I1201 10:11:39.995377 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:11:40 crc kubenswrapper[4763]: I1201 10:11:40.504182 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"17efb0e546f7d9b6580631e7298f5622be7778a3f4d53f06a30f0c7d41d4ce13"} Dec 01 10:11:40 crc kubenswrapper[4763]: W1201 10:11:40.716859 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18c3fc4b_5681_40d1_8b65_e85af0a1905e.slice/crio-ca96cd1eebe20d4a44186d374ff298d9468c90db9a9a3356265a052d4b9e779c WatchSource:0}: Error finding container ca96cd1eebe20d4a44186d374ff298d9468c90db9a9a3356265a052d4b9e779c: Status 404 returned error can't find the container with id ca96cd1eebe20d4a44186d374ff298d9468c90db9a9a3356265a052d4b9e779c Dec 01 10:11:40 crc kubenswrapper[4763]: I1201 10:11:40.723594 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 01 10:11:41 crc kubenswrapper[4763]: I1201 10:11:41.009173 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee391497-0e27-412d-9209-a5c47226a435" path="/var/lib/kubelet/pods/ee391497-0e27-412d-9209-a5c47226a435/volumes" Dec 01 10:11:41 crc kubenswrapper[4763]: I1201 10:11:41.522415 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"18c3fc4b-5681-40d1-8b65-e85af0a1905e","Type":"ContainerStarted","Data":"ca96cd1eebe20d4a44186d374ff298d9468c90db9a9a3356265a052d4b9e779c"} Dec 01 10:11:42 crc kubenswrapper[4763]: I1201 10:11:42.533662 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"18c3fc4b-5681-40d1-8b65-e85af0a1905e","Type":"ContainerStarted","Data":"c886fee75250717389e58c802ccd4534712a4d9ef3071a92b755e38e4e526b82"} Dec 01 10:11:42 crc kubenswrapper[4763]: I1201 10:11:42.534096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"18c3fc4b-5681-40d1-8b65-e85af0a1905e","Type":"ContainerStarted","Data":"0e8bc7b6915dd49a03db626129ddd33a3d9bc520e0c8382a0ad058bc220dfbf2"} Dec 01 10:11:42 crc kubenswrapper[4763]: I1201 10:11:42.570830 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.570806159 podStartE2EDuration="3.570806159s" podCreationTimestamp="2025-12-01 10:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:11:42.564539318 +0000 UTC m=+3419.833188096" watchObservedRunningTime="2025-12-01 10:11:42.570806159 +0000 UTC m=+3419.839454927" Dec 01 10:11:45 crc kubenswrapper[4763]: I1201 10:11:45.304085 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 01 10:11:48 crc kubenswrapper[4763]: I1201 10:11:48.341830 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:48 crc kubenswrapper[4763]: I1201 10:11:48.342485 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:49 crc kubenswrapper[4763]: I1201 10:11:49.421594 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w5qhw" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="registry-server" probeResult="failure" output=< Dec 01 10:11:49 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 10:11:49 crc kubenswrapper[4763]: > Dec 01 10:11:49 crc kubenswrapper[4763]: I1201 10:11:49.906782 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 01 10:11:52 crc kubenswrapper[4763]: I1201 10:11:52.634572 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 01 10:11:57 crc kubenswrapper[4763]: I1201 10:11:57.008512 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 01 10:11:58 crc kubenswrapper[4763]: I1201 10:11:58.399928 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:58 crc kubenswrapper[4763]: I1201 10:11:58.455372 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:11:58 crc kubenswrapper[4763]: I1201 10:11:58.523362 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5qhw"] Dec 01 10:11:58 crc kubenswrapper[4763]: I1201 10:11:58.640918 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkpnv"] Dec 01 10:11:58 crc kubenswrapper[4763]: I1201 10:11:58.641229 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkpnv" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" containerName="registry-server" containerID="cri-o://b9f7d7429fafb5c44e334969e332b7a533980dea8d8a59c154298f44997d6bd9" gracePeriod=2 Dec 01 10:11:58 crc kubenswrapper[4763]: I1201 10:11:58.837416 4763 generic.go:334] "Generic (PLEG): container finished" podID="ff859352-a99d-4a67-9126-ec6a056b3236" containerID="b9f7d7429fafb5c44e334969e332b7a533980dea8d8a59c154298f44997d6bd9" exitCode=0 Dec 01 10:11:58 crc kubenswrapper[4763]: I1201 10:11:58.837736 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkpnv" event={"ID":"ff859352-a99d-4a67-9126-ec6a056b3236","Type":"ContainerDied","Data":"b9f7d7429fafb5c44e334969e332b7a533980dea8d8a59c154298f44997d6bd9"} Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.178352 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.316079 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-catalog-content\") pod \"ff859352-a99d-4a67-9126-ec6a056b3236\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.316235 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8q7q\" (UniqueName: \"kubernetes.io/projected/ff859352-a99d-4a67-9126-ec6a056b3236-kube-api-access-h8q7q\") pod \"ff859352-a99d-4a67-9126-ec6a056b3236\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.316322 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-utilities\") pod \"ff859352-a99d-4a67-9126-ec6a056b3236\" (UID: \"ff859352-a99d-4a67-9126-ec6a056b3236\") " Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.319162 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-utilities" (OuterVolumeSpecName: "utilities") pod "ff859352-a99d-4a67-9126-ec6a056b3236" (UID: "ff859352-a99d-4a67-9126-ec6a056b3236"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.335361 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff859352-a99d-4a67-9126-ec6a056b3236-kube-api-access-h8q7q" (OuterVolumeSpecName: "kube-api-access-h8q7q") pod "ff859352-a99d-4a67-9126-ec6a056b3236" (UID: "ff859352-a99d-4a67-9126-ec6a056b3236"). InnerVolumeSpecName "kube-api-access-h8q7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.419732 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8q7q\" (UniqueName: \"kubernetes.io/projected/ff859352-a99d-4a67-9126-ec6a056b3236-kube-api-access-h8q7q\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.420801 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.465969 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff859352-a99d-4a67-9126-ec6a056b3236" (UID: "ff859352-a99d-4a67-9126-ec6a056b3236"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.523158 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff859352-a99d-4a67-9126-ec6a056b3236-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.849416 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkpnv" event={"ID":"ff859352-a99d-4a67-9126-ec6a056b3236","Type":"ContainerDied","Data":"2a58ab9533de7aea397bf5b3ce3f5cbb58896123a3a23980e0498490220bb553"} Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.849449 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkpnv" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.849507 4763 scope.go:117] "RemoveContainer" containerID="b9f7d7429fafb5c44e334969e332b7a533980dea8d8a59c154298f44997d6bd9" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.886507 4763 scope.go:117] "RemoveContainer" containerID="c4e5c01e829785d3129ebb69ce64dba2c1b7e9f1450fa9bb895cffb5a0e5fcff" Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.901689 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkpnv"] Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.915319 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkpnv"] Dec 01 10:11:59 crc kubenswrapper[4763]: I1201 10:11:59.929436 4763 scope.go:117] "RemoveContainer" containerID="2175e774b58a169cb873945fb1b793f2cdbf5d40c186c621572258c9b2c969a0" Dec 01 10:12:01 crc kubenswrapper[4763]: I1201 10:12:01.008164 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" path="/var/lib/kubelet/pods/ff859352-a99d-4a67-9126-ec6a056b3236/volumes" Dec 01 10:12:01 crc kubenswrapper[4763]: I1201 10:12:01.649557 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.055233 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 10:13:06 crc kubenswrapper[4763]: E1201 10:13:06.056701 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" containerName="extract-content" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.056734 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" containerName="extract-content" Dec 01 10:13:06 crc kubenswrapper[4763]: E1201 10:13:06.056765 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" containerName="registry-server" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.056779 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" containerName="registry-server" Dec 01 10:13:06 crc kubenswrapper[4763]: E1201 10:13:06.056810 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" containerName="extract-utilities" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.056824 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" containerName="extract-utilities" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.057198 
4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff859352-a99d-4a67-9126-ec6a056b3236" containerName="registry-server" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.058792 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.062512 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.062781 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.063087 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.073551 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.075667 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kfvgw" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.129330 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.129399 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.129632 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-config-data\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.231378 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t49vh\" (UniqueName: \"kubernetes.io/projected/74d118c3-e544-4a7a-ad22-de496e16f9ee-kube-api-access-t49vh\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.231717 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.231752 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 
10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.231796 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.231841 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.231864 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.231913 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.231999 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-config-data\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.232036 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.233371 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.233589 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-config-data\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.254958 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.337081 4763 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t49vh\" (UniqueName: \"kubernetes.io/projected/74d118c3-e544-4a7a-ad22-de496e16f9ee-kube-api-access-t49vh\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.337160 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.337189 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.337229 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.337278 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.337384 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.338267 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.338814 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.340403 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.343093 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.346941 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.356836 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t49vh\" (UniqueName: \"kubernetes.io/projected/74d118c3-e544-4a7a-ad22-de496e16f9ee-kube-api-access-t49vh\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.377380 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.398749 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 10:13:06 crc kubenswrapper[4763]: I1201 10:13:06.885510 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 01 10:13:07 crc kubenswrapper[4763]: I1201 10:13:07.483943 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"74d118c3-e544-4a7a-ad22-de496e16f9ee","Type":"ContainerStarted","Data":"a043188c8ae77975dd6aa4d3b81ea337abe359a7934a4b50450abf85dec3f556"} Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.194033 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jtv7h"] Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.196549 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.215034 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jtv7h"] Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.261557 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-catalog-content\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.261892 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjnjj\" (UniqueName: \"kubernetes.io/projected/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-kube-api-access-jjnjj\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.261919 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-utilities\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.363439 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjnjj\" (UniqueName: \"kubernetes.io/projected/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-kube-api-access-jjnjj\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.363503 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-utilities\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.363547 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-catalog-content\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.364167 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-catalog-content\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.364256 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-utilities\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.384894 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jjnjj\" (UniqueName: \"kubernetes.io/projected/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-kube-api-access-jjnjj\") pod \"certified-operators-jtv7h\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:35 crc kubenswrapper[4763]: I1201 10:13:35.543097 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:13:50 crc kubenswrapper[4763]: E1201 10:13:50.925706 4763 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 01 10:13:50 crc kubenswrapper[4763]: E1201 10:13:50.930774 4763 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t49vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optiona
l:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(74d118c3-e544-4a7a-ad22-de496e16f9ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 01 10:13:50 crc kubenswrapper[4763]: E1201 10:13:50.932849 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="74d118c3-e544-4a7a-ad22-de496e16f9ee" Dec 01 10:13:51 crc kubenswrapper[4763]: I1201 10:13:51.363983 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jtv7h"] Dec 01 10:13:51 crc kubenswrapper[4763]: I1201 10:13:51.906835 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerID="527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e" exitCode=0 Dec 01 10:13:51 crc kubenswrapper[4763]: I1201 10:13:51.907091 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jtv7h" event={"ID":"4f8ca9e8-9c20-41a5-a939-38d6276bbd36","Type":"ContainerDied","Data":"527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e"} Dec 01 10:13:51 crc kubenswrapper[4763]: I1201 10:13:51.907235 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jtv7h" event={"ID":"4f8ca9e8-9c20-41a5-a939-38d6276bbd36","Type":"ContainerStarted","Data":"05600fbb3dbb4f43f4e845a646bd074973a56db20a4b06c886b976b785166dea"} Dec 01 10:13:51 crc kubenswrapper[4763]: E1201 10:13:51.909137 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="74d118c3-e544-4a7a-ad22-de496e16f9ee" Dec 01 10:13:53 crc kubenswrapper[4763]: I1201 10:13:53.926973 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jtv7h" event={"ID":"4f8ca9e8-9c20-41a5-a939-38d6276bbd36","Type":"ContainerStarted","Data":"f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db"} Dec 01 10:13:54 crc kubenswrapper[4763]: I1201 10:13:54.936211 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerID="f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db" exitCode=0 Dec 01 10:13:54 crc kubenswrapper[4763]: I1201 10:13:54.936280 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jtv7h" event={"ID":"4f8ca9e8-9c20-41a5-a939-38d6276bbd36","Type":"ContainerDied","Data":"f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db"} Dec 01 10:13:55 crc kubenswrapper[4763]: I1201 10:13:55.948305 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jtv7h" 
event={"ID":"4f8ca9e8-9c20-41a5-a939-38d6276bbd36","Type":"ContainerStarted","Data":"b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec"} Dec 01 10:13:55 crc kubenswrapper[4763]: I1201 10:13:55.978264 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jtv7h" podStartSLOduration=17.443625991 podStartE2EDuration="20.978242019s" podCreationTimestamp="2025-12-01 10:13:35 +0000 UTC" firstStartedPulling="2025-12-01 10:13:51.909094526 +0000 UTC m=+3549.177743334" lastFinishedPulling="2025-12-01 10:13:55.443710594 +0000 UTC m=+3552.712359362" observedRunningTime="2025-12-01 10:13:55.967826724 +0000 UTC m=+3553.236475512" watchObservedRunningTime="2025-12-01 10:13:55.978242019 +0000 UTC m=+3553.246890797" Dec 01 10:14:01 crc kubenswrapper[4763]: I1201 10:14:01.262410 4763 patch_prober.go:28] interesting pod/route-controller-manager-7656c99f76-hm47k container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 01 10:14:01 crc kubenswrapper[4763]: I1201 10:14:01.263444 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7656c99f76-hm47k" podUID="cbc4681b-c41e-4ffa-b884-bf63d1b4147e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 01 10:14:03 crc kubenswrapper[4763]: I1201 10:14:03.929755 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:14:03 crc kubenswrapper[4763]: I1201 10:14:03.930359 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:14:05 crc kubenswrapper[4763]: I1201 10:14:05.544127 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:14:05 crc kubenswrapper[4763]: I1201 10:14:05.544188 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:14:05 crc kubenswrapper[4763]: I1201 10:14:05.628259 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:14:06 crc kubenswrapper[4763]: I1201 10:14:06.086846 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:14:06 crc kubenswrapper[4763]: I1201 10:14:06.133593 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jtv7h"] Dec 01 10:14:07 crc kubenswrapper[4763]: I1201 10:14:07.597763 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 01 
10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.068667 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jtv7h" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerName="registry-server" containerID="cri-o://b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec" gracePeriod=2 Dec 01 10:14:08 crc kubenswrapper[4763]: E1201 10:14:08.169760 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f8ca9e8_9c20_41a5_a939_38d6276bbd36.slice/crio-conmon-b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.616331 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.766201 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjnjj\" (UniqueName: \"kubernetes.io/projected/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-kube-api-access-jjnjj\") pod \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.766376 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-utilities\") pod \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.767248 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-utilities" (OuterVolumeSpecName: "utilities") pod "4f8ca9e8-9c20-41a5-a939-38d6276bbd36" (UID: "4f8ca9e8-9c20-41a5-a939-38d6276bbd36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.767548 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-catalog-content\") pod \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\" (UID: \"4f8ca9e8-9c20-41a5-a939-38d6276bbd36\") " Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.772119 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-kube-api-access-jjnjj" (OuterVolumeSpecName: "kube-api-access-jjnjj") pod "4f8ca9e8-9c20-41a5-a939-38d6276bbd36" (UID: "4f8ca9e8-9c20-41a5-a939-38d6276bbd36"). InnerVolumeSpecName "kube-api-access-jjnjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.776893 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjnjj\" (UniqueName: \"kubernetes.io/projected/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-kube-api-access-jjnjj\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.776923 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.829155 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f8ca9e8-9c20-41a5-a939-38d6276bbd36" (UID: "4f8ca9e8-9c20-41a5-a939-38d6276bbd36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:14:08 crc kubenswrapper[4763]: I1201 10:14:08.878363 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f8ca9e8-9c20-41a5-a939-38d6276bbd36-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.079196 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"74d118c3-e544-4a7a-ad22-de496e16f9ee","Type":"ContainerStarted","Data":"1e41c8399655877e1c581b04d2303ad6b82cd87dc49b44233448dbc0ab8abcd4"} Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.081348 4763 generic.go:334] "Generic (PLEG): container finished" podID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerID="b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec" exitCode=0 Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.081372 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jtv7h" event={"ID":"4f8ca9e8-9c20-41a5-a939-38d6276bbd36","Type":"ContainerDied","Data":"b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec"} Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.081396 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jtv7h" event={"ID":"4f8ca9e8-9c20-41a5-a939-38d6276bbd36","Type":"ContainerDied","Data":"05600fbb3dbb4f43f4e845a646bd074973a56db20a4b06c886b976b785166dea"} Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.081413 4763 scope.go:117] "RemoveContainer" containerID="b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.081359 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jtv7h" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.103025 4763 scope.go:117] "RemoveContainer" containerID="f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.115926 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.420219399 podStartE2EDuration="1m4.115900644s" podCreationTimestamp="2025-12-01 10:13:05 +0000 UTC" firstStartedPulling="2025-12-01 10:13:06.898395245 +0000 UTC m=+3504.167044033" lastFinishedPulling="2025-12-01 10:14:07.59407651 +0000 UTC m=+3564.862725278" observedRunningTime="2025-12-01 10:14:09.102505897 +0000 UTC m=+3566.371154705" watchObservedRunningTime="2025-12-01 10:14:09.115900644 +0000 UTC m=+3566.384549422" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.131239 4763 scope.go:117] "RemoveContainer" containerID="527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.135532 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jtv7h"] Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.143823 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jtv7h"] Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.180075 4763 scope.go:117] "RemoveContainer" containerID="b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec" Dec 01 10:14:09 crc kubenswrapper[4763]: E1201 10:14:09.180607 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec\": container with ID starting with b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec not found: ID does not exist" containerID="b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.180638 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec"} err="failed to get container status \"b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec\": rpc error: code = NotFound desc = could not find container \"b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec\": container with ID starting with b6de1ce799b1839f9095bfd33359aabe4424dbc00e5e4cbc7c8bafce69df97ec not found: ID does not exist" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.180661 4763 scope.go:117] "RemoveContainer" containerID="f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db" Dec 01 10:14:09 crc kubenswrapper[4763]: E1201 10:14:09.180973 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db\": container with ID starting with f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db not found: ID does not exist" containerID="f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.180996 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db"} err="failed to get container status 
\"f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db\": rpc error: code = NotFound desc = could not find container \"f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db\": container with ID starting with f52944dfee192ec8e2aeb91fdbd20897026bc44f096abde5def02b1a860fe7db not found: ID does not exist" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.181009 4763 scope.go:117] "RemoveContainer" containerID="527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e" Dec 01 10:14:09 crc kubenswrapper[4763]: E1201 10:14:09.181404 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e\": container with ID starting with 527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e not found: ID does not exist" containerID="527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e" Dec 01 10:14:09 crc kubenswrapper[4763]: I1201 10:14:09.181534 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e"} err="failed to get container status \"527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e\": rpc error: code = NotFound desc = could not find container \"527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e\": container with ID starting with 527ac9e982484a78a94a7184a68f72cc5fc479c50f323dd3ac03a95673379b7e not found: ID does not exist" Dec 01 10:14:11 crc kubenswrapper[4763]: I1201 10:14:11.007394 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" path="/var/lib/kubelet/pods/4f8ca9e8-9c20-41a5-a939-38d6276bbd36/volumes" Dec 01 10:14:33 crc kubenswrapper[4763]: I1201 10:14:33.929745 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:14:33 crc kubenswrapper[4763]: I1201 10:14:33.931141 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.184693 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq"] Dec 01 10:15:00 crc kubenswrapper[4763]: E1201 10:15:00.185440 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerName="extract-content" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.185540 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerName="extract-content" Dec 01 10:15:00 crc kubenswrapper[4763]: E1201 10:15:00.185548 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerName="registry-server" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.185554 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerName="registry-server" Dec 01 
10:15:00 crc kubenswrapper[4763]: E1201 10:15:00.185588 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerName="extract-utilities" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.185598 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerName="extract-utilities" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.185937 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8ca9e8-9c20-41a5-a939-38d6276bbd36" containerName="registry-server" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.186559 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.190046 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.190552 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.196052 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq"] Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.256775 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/028f3f7d-3fc7-477b-af14-03d248b6e9a7-config-volume\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.257137 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/028f3f7d-3fc7-477b-af14-03d248b6e9a7-secret-volume\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.257247 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq8qx\" (UniqueName: \"kubernetes.io/projected/028f3f7d-3fc7-477b-af14-03d248b6e9a7-kube-api-access-lq8qx\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.359266 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/028f3f7d-3fc7-477b-af14-03d248b6e9a7-config-volume\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.359510 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/028f3f7d-3fc7-477b-af14-03d248b6e9a7-secret-volume\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.359565 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq8qx\" (UniqueName: \"kubernetes.io/projected/028f3f7d-3fc7-477b-af14-03d248b6e9a7-kube-api-access-lq8qx\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.360272 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/028f3f7d-3fc7-477b-af14-03d248b6e9a7-config-volume\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.368962 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/028f3f7d-3fc7-477b-af14-03d248b6e9a7-secret-volume\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.378773 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq8qx\" (UniqueName: \"kubernetes.io/projected/028f3f7d-3fc7-477b-af14-03d248b6e9a7-kube-api-access-lq8qx\") pod \"collect-profiles-29409735-dj8fq\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.508237 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vjr8n"] Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.517521 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.522355 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.566522 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-utilities\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.571527 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq9wj\" (UniqueName: \"kubernetes.io/projected/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-kube-api-access-pq9wj\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.572058 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-catalog-content\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.578353 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vjr8n"] Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.674505 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-catalog-content\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.675074 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-catalog-content\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.675519 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-utilities\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.675644 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-utilities\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.675780 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq9wj\" (UniqueName: \"kubernetes.io/projected/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-kube-api-access-pq9wj\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.713709 4763 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq9wj\" (UniqueName: \"kubernetes.io/projected/26163ccc-7fc4-4baa-9bf0-7ca523c888ea-kube-api-access-pq9wj\") pod \"community-operators-vjr8n\" (UID: \"26163ccc-7fc4-4baa-9bf0-7ca523c888ea\") " pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:00 crc kubenswrapper[4763]: I1201 10:15:00.895611 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:01 crc kubenswrapper[4763]: I1201 10:15:01.133138 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq"] Dec 01 10:15:01 crc kubenswrapper[4763]: I1201 10:15:01.548663 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vjr8n"] Dec 01 10:15:01 crc kubenswrapper[4763]: I1201 10:15:01.621631 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjr8n" event={"ID":"26163ccc-7fc4-4baa-9bf0-7ca523c888ea","Type":"ContainerStarted","Data":"7a3941a68c14a0be81cba3ccaa2854b17b53221bbe71e4ad68bcba69bb856d3b"} Dec 01 10:15:01 crc kubenswrapper[4763]: I1201 10:15:01.624133 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" event={"ID":"028f3f7d-3fc7-477b-af14-03d248b6e9a7","Type":"ContainerStarted","Data":"e21b3e5fcef670b0771076238f063f0e5731c3828f149f129eba5cb19fc26222"} Dec 01 10:15:01 crc kubenswrapper[4763]: I1201 10:15:01.624213 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" event={"ID":"028f3f7d-3fc7-477b-af14-03d248b6e9a7","Type":"ContainerStarted","Data":"133465c963e39ebe2bcedaaacccf686e6f1b19c6a77b485498ead7a78714e0c5"} Dec 01 10:15:01 crc kubenswrapper[4763]: I1201 10:15:01.656709 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" podStartSLOduration=1.656688225 podStartE2EDuration="1.656688225s" podCreationTimestamp="2025-12-01 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:15:01.652535382 +0000 UTC m=+3618.921184150" watchObservedRunningTime="2025-12-01 10:15:01.656688225 +0000 UTC m=+3618.925336983" Dec 01 10:15:02 crc kubenswrapper[4763]: I1201 10:15:02.648033 4763 generic.go:334] "Generic (PLEG): container finished" podID="028f3f7d-3fc7-477b-af14-03d248b6e9a7" containerID="e21b3e5fcef670b0771076238f063f0e5731c3828f149f129eba5cb19fc26222" exitCode=0 Dec 01 10:15:02 crc kubenswrapper[4763]: I1201 10:15:02.648600 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" event={"ID":"028f3f7d-3fc7-477b-af14-03d248b6e9a7","Type":"ContainerDied","Data":"e21b3e5fcef670b0771076238f063f0e5731c3828f149f129eba5cb19fc26222"} Dec 01 10:15:02 crc kubenswrapper[4763]: I1201 10:15:02.657219 4763 generic.go:334] "Generic (PLEG): container finished" podID="26163ccc-7fc4-4baa-9bf0-7ca523c888ea" containerID="a88c1726abc7d9d13580a03cd5460280d651fbba0f44247bd307c9df1b583037" exitCode=0 Dec 01 10:15:02 crc kubenswrapper[4763]: I1201 10:15:02.657272 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjr8n" 
event={"ID":"26163ccc-7fc4-4baa-9bf0-7ca523c888ea","Type":"ContainerDied","Data":"a88c1726abc7d9d13580a03cd5460280d651fbba0f44247bd307c9df1b583037"} Dec 01 10:15:03 crc kubenswrapper[4763]: I1201 10:15:03.930191 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:15:03 crc kubenswrapper[4763]: I1201 10:15:03.930652 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:15:03 crc kubenswrapper[4763]: I1201 10:15:03.930750 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 10:15:03 crc kubenswrapper[4763]: I1201 10:15:03.931563 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17efb0e546f7d9b6580631e7298f5622be7778a3f4d53f06a30f0c7d41d4ce13"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:15:03 crc kubenswrapper[4763]: I1201 10:15:03.931880 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://17efb0e546f7d9b6580631e7298f5622be7778a3f4d53f06a30f0c7d41d4ce13" gracePeriod=600 Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.113317 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.162090 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/028f3f7d-3fc7-477b-af14-03d248b6e9a7-secret-volume\") pod \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.162226 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq8qx\" (UniqueName: \"kubernetes.io/projected/028f3f7d-3fc7-477b-af14-03d248b6e9a7-kube-api-access-lq8qx\") pod \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.162266 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/028f3f7d-3fc7-477b-af14-03d248b6e9a7-config-volume\") pod \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\" (UID: \"028f3f7d-3fc7-477b-af14-03d248b6e9a7\") " Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.163554 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/028f3f7d-3fc7-477b-af14-03d248b6e9a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "028f3f7d-3fc7-477b-af14-03d248b6e9a7" (UID: "028f3f7d-3fc7-477b-af14-03d248b6e9a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.169723 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/028f3f7d-3fc7-477b-af14-03d248b6e9a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "028f3f7d-3fc7-477b-af14-03d248b6e9a7" (UID: "028f3f7d-3fc7-477b-af14-03d248b6e9a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.169762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028f3f7d-3fc7-477b-af14-03d248b6e9a7-kube-api-access-lq8qx" (OuterVolumeSpecName: "kube-api-access-lq8qx") pod "028f3f7d-3fc7-477b-af14-03d248b6e9a7" (UID: "028f3f7d-3fc7-477b-af14-03d248b6e9a7"). InnerVolumeSpecName "kube-api-access-lq8qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.264955 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/028f3f7d-3fc7-477b-af14-03d248b6e9a7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.265284 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq8qx\" (UniqueName: \"kubernetes.io/projected/028f3f7d-3fc7-477b-af14-03d248b6e9a7-kube-api-access-lq8qx\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.265294 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/028f3f7d-3fc7-477b-af14-03d248b6e9a7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.677501 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.677515 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409735-dj8fq" event={"ID":"028f3f7d-3fc7-477b-af14-03d248b6e9a7","Type":"ContainerDied","Data":"133465c963e39ebe2bcedaaacccf686e6f1b19c6a77b485498ead7a78714e0c5"} Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.677819 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="133465c963e39ebe2bcedaaacccf686e6f1b19c6a77b485498ead7a78714e0c5" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.702001 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="17efb0e546f7d9b6580631e7298f5622be7778a3f4d53f06a30f0c7d41d4ce13" exitCode=0 Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.702261 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"17efb0e546f7d9b6580631e7298f5622be7778a3f4d53f06a30f0c7d41d4ce13"} Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.702296 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6"} Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.702313 4763 scope.go:117] "RemoveContainer" containerID="25e9e6c19b8eb54e9e27eeb75359affa5d825efae3bc0b0c7187aa8e7a3c99eb" Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.779739 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m"] Dec 01 10:15:04 crc kubenswrapper[4763]: I1201 10:15:04.790120 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409690-drv5m"] Dec 01 10:15:05 crc kubenswrapper[4763]: I1201 10:15:05.013365 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aa0ec16-0f6b-4b6b-894c-31c949e95498" path="/var/lib/kubelet/pods/5aa0ec16-0f6b-4b6b-894c-31c949e95498/volumes" Dec 01 10:15:05 crc kubenswrapper[4763]: I1201 10:15:05.983210 4763 scope.go:117] "RemoveContainer" containerID="c46e6b46de1606f5513d6221c403d4c8ef18650f577951ebee6572d3046aa5a0" Dec 01 10:15:09 crc kubenswrapper[4763]: I1201 10:15:09.763339 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjr8n" event={"ID":"26163ccc-7fc4-4baa-9bf0-7ca523c888ea","Type":"ContainerStarted","Data":"e10e201926ca2ee905bd41163b2570aaab40e4f3af8fe96a649f8ac826104d28"} Dec 01 10:15:09 crc kubenswrapper[4763]: E1201 10:15:09.800428 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26163ccc_7fc4_4baa_9bf0_7ca523c888ea.slice/crio-e10e201926ca2ee905bd41163b2570aaab40e4f3af8fe96a649f8ac826104d28.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:15:10 crc kubenswrapper[4763]: I1201 10:15:10.773212 4763 generic.go:334] "Generic (PLEG): container finished" podID="26163ccc-7fc4-4baa-9bf0-7ca523c888ea" 
containerID="e10e201926ca2ee905bd41163b2570aaab40e4f3af8fe96a649f8ac826104d28" exitCode=0 Dec 01 10:15:10 crc kubenswrapper[4763]: I1201 10:15:10.773295 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjr8n" event={"ID":"26163ccc-7fc4-4baa-9bf0-7ca523c888ea","Type":"ContainerDied","Data":"e10e201926ca2ee905bd41163b2570aaab40e4f3af8fe96a649f8ac826104d28"} Dec 01 10:15:11 crc kubenswrapper[4763]: I1201 10:15:11.782661 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vjr8n" event={"ID":"26163ccc-7fc4-4baa-9bf0-7ca523c888ea","Type":"ContainerStarted","Data":"32b6035d160097bd894ba74fe2543299f1803881c0f735aff76c1497c9b2152a"} Dec 01 10:15:11 crc kubenswrapper[4763]: I1201 10:15:11.813584 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vjr8n" podStartSLOduration=3.809626611 podStartE2EDuration="11.813563067s" podCreationTimestamp="2025-12-01 10:15:00 +0000 UTC" firstStartedPulling="2025-12-01 10:15:02.661915096 +0000 UTC m=+3619.930563864" lastFinishedPulling="2025-12-01 10:15:10.665851542 +0000 UTC m=+3627.934500320" observedRunningTime="2025-12-01 10:15:11.803440519 +0000 UTC m=+3629.072089277" watchObservedRunningTime="2025-12-01 10:15:11.813563067 +0000 UTC m=+3629.082211835" Dec 01 10:15:20 crc kubenswrapper[4763]: I1201 10:15:20.896277 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:20 crc kubenswrapper[4763]: I1201 10:15:20.896714 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:20 crc kubenswrapper[4763]: I1201 10:15:20.950223 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:21 crc kubenswrapper[4763]: I1201 10:15:21.909369 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vjr8n" Dec 01 10:15:21 crc kubenswrapper[4763]: I1201 10:15:21.970887 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vjr8n"] Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.027793 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hjvw"] Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.028066 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7hjvw" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="registry-server" containerID="cri-o://630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf" gracePeriod=2 Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.775868 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7hjvw" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.847972 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-catalog-content\") pod \"c11aee5c-47cc-4313-9380-0146961c94b8\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.848351 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-utilities\") pod \"c11aee5c-47cc-4313-9380-0146961c94b8\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.848495 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr6bt\" (UniqueName: \"kubernetes.io/projected/c11aee5c-47cc-4313-9380-0146961c94b8-kube-api-access-vr6bt\") pod \"c11aee5c-47cc-4313-9380-0146961c94b8\" (UID: \"c11aee5c-47cc-4313-9380-0146961c94b8\") " Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.879957 4763 generic.go:334] "Generic (PLEG): container finished" podID="c11aee5c-47cc-4313-9380-0146961c94b8" containerID="630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf" exitCode=0 Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.880015 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hjvw" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.880046 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hjvw" event={"ID":"c11aee5c-47cc-4313-9380-0146961c94b8","Type":"ContainerDied","Data":"630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf"} Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.880096 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hjvw" event={"ID":"c11aee5c-47cc-4313-9380-0146961c94b8","Type":"ContainerDied","Data":"582da9d42373f0559d839ad39b58cce8fd5719eeef9ae3393af935a8881e1cda"} Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.880114 4763 scope.go:117] "RemoveContainer" containerID="630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.892177 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-utilities" (OuterVolumeSpecName: "utilities") pod "c11aee5c-47cc-4313-9380-0146961c94b8" (UID: "c11aee5c-47cc-4313-9380-0146961c94b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.899762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11aee5c-47cc-4313-9380-0146961c94b8-kube-api-access-vr6bt" (OuterVolumeSpecName: "kube-api-access-vr6bt") pod "c11aee5c-47cc-4313-9380-0146961c94b8" (UID: "c11aee5c-47cc-4313-9380-0146961c94b8"). InnerVolumeSpecName "kube-api-access-vr6bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.911756 4763 scope.go:117] "RemoveContainer" containerID="119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.951211 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.951245 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr6bt\" (UniqueName: \"kubernetes.io/projected/c11aee5c-47cc-4313-9380-0146961c94b8-kube-api-access-vr6bt\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.963048 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c11aee5c-47cc-4313-9380-0146961c94b8" (UID: "c11aee5c-47cc-4313-9380-0146961c94b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:15:22 crc kubenswrapper[4763]: I1201 10:15:22.972299 4763 scope.go:117] "RemoveContainer" containerID="aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77" Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.026780 4763 scope.go:117] "RemoveContainer" containerID="630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf" Dec 01 10:15:23 crc kubenswrapper[4763]: E1201 10:15:23.028443 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf\": container with ID starting with 630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf not found: ID does not exist" containerID="630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf" Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.028514 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf"} err="failed to get container status \"630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf\": rpc error: code = NotFound desc = could not find container \"630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf\": container with ID starting with 630ed4c340f1e323deb119f5398db993bea8d34fa810ccbcd6b95b2c2b3d74bf not found: ID does not exist" Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.028548 4763 scope.go:117] "RemoveContainer" containerID="119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2" Dec 01 10:15:23 crc kubenswrapper[4763]: E1201 10:15:23.029836 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2\": container with ID starting with 119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2 not found: ID does not exist" containerID="119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2" Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.029860 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2"} err="failed to get container status 
\"119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2\": rpc error: code = NotFound desc = could not find container \"119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2\": container with ID starting with 119f8becf11f354e31054c108f149029ab381fa8960b91d610637b89980b8bc2 not found: ID does not exist" Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.029874 4763 scope.go:117] "RemoveContainer" containerID="aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77" Dec 01 10:15:23 crc kubenswrapper[4763]: E1201 10:15:23.030955 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77\": container with ID starting with aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77 not found: ID does not exist" containerID="aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77" Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.031003 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77"} err="failed to get container status \"aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77\": rpc error: code = NotFound desc = could not find container \"aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77\": container with ID starting with aa48fbf0033060c1c9a62530db61d05738f69a72a4bab5fecfd7cc576eed1e77 not found: ID does not exist" Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.053068 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c11aee5c-47cc-4313-9380-0146961c94b8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.207214 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hjvw"] Dec 01 10:15:23 crc kubenswrapper[4763]: I1201 10:15:23.230198 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7hjvw"] Dec 01 10:15:25 crc kubenswrapper[4763]: I1201 10:15:25.005861 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" path="/var/lib/kubelet/pods/c11aee5c-47cc-4313-9380-0146961c94b8/volumes" Dec 01 10:17:33 crc kubenswrapper[4763]: I1201 10:17:33.929587 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:17:33 crc kubenswrapper[4763]: I1201 10:17:33.930213 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:18:03 crc kubenswrapper[4763]: I1201 10:18:03.929330 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:18:03 crc 
kubenswrapper[4763]: I1201 10:18:03.930715 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:18:33 crc kubenswrapper[4763]: I1201 10:18:33.929060 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:18:33 crc kubenswrapper[4763]: I1201 10:18:33.929640 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:18:33 crc kubenswrapper[4763]: I1201 10:18:33.929697 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 10:18:33 crc kubenswrapper[4763]: I1201 10:18:33.930561 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:18:33 crc kubenswrapper[4763]: I1201 10:18:33.930627 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" gracePeriod=600 Dec 01 10:18:34 crc kubenswrapper[4763]: E1201 10:18:34.054383 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:18:34 crc kubenswrapper[4763]: I1201 10:18:34.683817 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" exitCode=0 Dec 01 10:18:34 crc kubenswrapper[4763]: I1201 10:18:34.683863 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6"} Dec 01 10:18:34 crc kubenswrapper[4763]: I1201 10:18:34.684102 4763 scope.go:117] "RemoveContainer" containerID="17efb0e546f7d9b6580631e7298f5622be7778a3f4d53f06a30f0c7d41d4ce13" Dec 01 10:18:34 crc kubenswrapper[4763]: I1201 10:18:34.685866 4763 scope.go:117] "RemoveContainer" 
containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:18:34 crc kubenswrapper[4763]: E1201 10:18:34.686268 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:18:45 crc kubenswrapper[4763]: I1201 10:18:45.994309 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:18:45 crc kubenswrapper[4763]: E1201 10:18:45.995099 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:18:59 crc kubenswrapper[4763]: I1201 10:18:59.995015 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:18:59 crc kubenswrapper[4763]: E1201 10:18:59.996256 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:19:13 crc kubenswrapper[4763]: I1201 10:19:13.994167 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:19:13 crc kubenswrapper[4763]: E1201 10:19:13.995075 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:19:24 crc kubenswrapper[4763]: I1201 10:19:24.996889 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:19:24 crc kubenswrapper[4763]: E1201 10:19:24.997677 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:19:38 crc kubenswrapper[4763]: I1201 10:19:38.994830 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:19:38 crc kubenswrapper[4763]: E1201 10:19:38.995950 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:19:53 crc kubenswrapper[4763]: I1201 10:19:53.994986 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:19:53 crc kubenswrapper[4763]: E1201 10:19:53.995744 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:20:05 crc kubenswrapper[4763]: I1201 10:20:05.995237 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:20:05 crc kubenswrapper[4763]: E1201 10:20:05.996158 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:20:10 crc kubenswrapper[4763]: I1201 10:20:10.046416 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-ldpps"] Dec 01 10:20:10 crc kubenswrapper[4763]: I1201 10:20:10.056160 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-ldpps"] Dec 01 10:20:11 crc kubenswrapper[4763]: I1201 10:20:11.004873 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017f4f6a-809c-4019-9645-0e24cb5f2827" path="/var/lib/kubelet/pods/017f4f6a-809c-4019-9645-0e24cb5f2827/volumes" Dec 01 10:20:12 crc kubenswrapper[4763]: I1201 10:20:12.031423 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-3e78-account-create-update-fdhhc"] Dec 01 10:20:12 crc kubenswrapper[4763]: I1201 10:20:12.042307 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-3e78-account-create-update-fdhhc"] Dec 01 10:20:13 crc kubenswrapper[4763]: I1201 10:20:13.006975 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19909ecc-64c8-4808-9447-c0ade391a43b" path="/var/lib/kubelet/pods/19909ecc-64c8-4808-9447-c0ade391a43b/volumes" Dec 01 10:20:20 crc kubenswrapper[4763]: I1201 10:20:20.994590 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:20:20 crc kubenswrapper[4763]: E1201 10:20:20.995371 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" 
podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:20:33 crc kubenswrapper[4763]: I1201 10:20:33.001063 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:20:33 crc kubenswrapper[4763]: E1201 10:20:33.011554 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:20:44 crc kubenswrapper[4763]: I1201 10:20:44.995134 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:20:44 crc kubenswrapper[4763]: E1201 10:20:44.995908 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:20:56 crc kubenswrapper[4763]: I1201 10:20:56.995154 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:20:56 crc kubenswrapper[4763]: E1201 10:20:56.996048 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:21:04 crc kubenswrapper[4763]: I1201 10:21:04.070709 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-qrvr5"] Dec 01 10:21:04 crc kubenswrapper[4763]: I1201 10:21:04.082515 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-qrvr5"] Dec 01 10:21:05 crc kubenswrapper[4763]: I1201 10:21:05.008383 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b9fee43-2772-4498-8cd2-dc2ae1d8b9da" path="/var/lib/kubelet/pods/9b9fee43-2772-4498-8cd2-dc2ae1d8b9da/volumes" Dec 01 10:21:08 crc kubenswrapper[4763]: I1201 10:21:08.876748 4763 scope.go:117] "RemoveContainer" containerID="aab2da22e8c0200e46dc23f6d0d1138c5a10cbed00d157d231a6c74d6f067e18" Dec 01 10:21:08 crc kubenswrapper[4763]: I1201 10:21:08.904800 4763 scope.go:117] "RemoveContainer" containerID="e9c475c7206c829f3cfb2adc355bd659bb71faa2720e12a3ee7a7df69d3a377d" Dec 01 10:21:08 crc kubenswrapper[4763]: I1201 10:21:08.954730 4763 scope.go:117] "RemoveContainer" containerID="e517b181aa3ffdcd0ef26bd6858d4db9e4b09a586699b4d2f7537ea5e49ee60e" Dec 01 10:21:11 crc kubenswrapper[4763]: I1201 10:21:11.994786 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:21:11 crc kubenswrapper[4763]: E1201 10:21:11.996480 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:21:25 crc kubenswrapper[4763]: I1201 10:21:25.995510 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:21:25 crc kubenswrapper[4763]: E1201 10:21:25.996740 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:21:36 crc kubenswrapper[4763]: I1201 10:21:36.994123 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:21:36 crc kubenswrapper[4763]: E1201 10:21:36.995066 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.777148 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fsnqs"] Dec 01 10:21:41 crc kubenswrapper[4763]: E1201 10:21:41.778634 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="extract-content" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.778652 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="extract-content" Dec 01 10:21:41 crc kubenswrapper[4763]: E1201 10:21:41.778671 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="registry-server" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.778679 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="registry-server" Dec 01 10:21:41 crc kubenswrapper[4763]: E1201 10:21:41.778715 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028f3f7d-3fc7-477b-af14-03d248b6e9a7" containerName="collect-profiles" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.778723 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="028f3f7d-3fc7-477b-af14-03d248b6e9a7" containerName="collect-profiles" Dec 01 10:21:41 crc kubenswrapper[4763]: E1201 10:21:41.778748 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="extract-utilities" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.778757 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="extract-utilities" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.778966 4763 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="028f3f7d-3fc7-477b-af14-03d248b6e9a7" containerName="collect-profiles" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.778997 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11aee5c-47cc-4313-9380-0146961c94b8" containerName="registry-server" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.780734 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.828468 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fsnqs"] Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.859979 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqt4\" (UniqueName: \"kubernetes.io/projected/678966e0-053a-40f9-b29f-84b8ab6dbc83-kube-api-access-8zqt4\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.860298 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678966e0-053a-40f9-b29f-84b8ab6dbc83-catalog-content\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.860560 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678966e0-053a-40f9-b29f-84b8ab6dbc83-utilities\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.962187 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678966e0-053a-40f9-b29f-84b8ab6dbc83-catalog-content\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.962352 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678966e0-053a-40f9-b29f-84b8ab6dbc83-utilities\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.962383 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqt4\" (UniqueName: \"kubernetes.io/projected/678966e0-053a-40f9-b29f-84b8ab6dbc83-kube-api-access-8zqt4\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.963142 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678966e0-053a-40f9-b29f-84b8ab6dbc83-catalog-content\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.963526 4763 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678966e0-053a-40f9-b29f-84b8ab6dbc83-utilities\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:41 crc kubenswrapper[4763]: I1201 10:21:41.986664 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqt4\" (UniqueName: \"kubernetes.io/projected/678966e0-053a-40f9-b29f-84b8ab6dbc83-kube-api-access-8zqt4\") pod \"redhat-operators-fsnqs\" (UID: \"678966e0-053a-40f9-b29f-84b8ab6dbc83\") " pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:42 crc kubenswrapper[4763]: I1201 10:21:42.142755 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:21:43 crc kubenswrapper[4763]: I1201 10:21:43.414829 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fsnqs"] Dec 01 10:21:44 crc kubenswrapper[4763]: I1201 10:21:44.401659 4763 generic.go:334] "Generic (PLEG): container finished" podID="678966e0-053a-40f9-b29f-84b8ab6dbc83" containerID="e378f51241116f6efef4dd40987bf1005ff72c245b94c91bc9ff929e2678458c" exitCode=0 Dec 01 10:21:44 crc kubenswrapper[4763]: I1201 10:21:44.401770 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsnqs" event={"ID":"678966e0-053a-40f9-b29f-84b8ab6dbc83","Type":"ContainerDied","Data":"e378f51241116f6efef4dd40987bf1005ff72c245b94c91bc9ff929e2678458c"} Dec 01 10:21:44 crc kubenswrapper[4763]: I1201 10:21:44.402016 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsnqs" event={"ID":"678966e0-053a-40f9-b29f-84b8ab6dbc83","Type":"ContainerStarted","Data":"bdfe867b43299075a088d856e6fedd3456ea6f9561df5e29b0a104613f70f2b6"} Dec 01 10:21:44 crc kubenswrapper[4763]: I1201 10:21:44.404885 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:21:51 crc kubenswrapper[4763]: I1201 10:21:51.994270 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:21:51 crc kubenswrapper[4763]: E1201 10:21:51.995107 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:21:54 crc kubenswrapper[4763]: I1201 10:21:54.505111 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsnqs" event={"ID":"678966e0-053a-40f9-b29f-84b8ab6dbc83","Type":"ContainerStarted","Data":"2ee16163bda3c7c9008585171363b1da39c39f5b9d26b8a5f41a5736e2fd6c71"} Dec 01 10:21:57 crc kubenswrapper[4763]: I1201 10:21:57.541641 4763 generic.go:334] "Generic (PLEG): container finished" podID="678966e0-053a-40f9-b29f-84b8ab6dbc83" containerID="2ee16163bda3c7c9008585171363b1da39c39f5b9d26b8a5f41a5736e2fd6c71" exitCode=0 Dec 01 10:21:57 crc kubenswrapper[4763]: I1201 10:21:57.541704 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsnqs" 
event={"ID":"678966e0-053a-40f9-b29f-84b8ab6dbc83","Type":"ContainerDied","Data":"2ee16163bda3c7c9008585171363b1da39c39f5b9d26b8a5f41a5736e2fd6c71"} Dec 01 10:21:58 crc kubenswrapper[4763]: I1201 10:21:58.551992 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsnqs" event={"ID":"678966e0-053a-40f9-b29f-84b8ab6dbc83","Type":"ContainerStarted","Data":"a3c9f0a9db27dd1f41925544e31038dc829a1b82c6363eeaf44dbed6cb21d4d1"} Dec 01 10:21:58 crc kubenswrapper[4763]: I1201 10:21:58.576022 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fsnqs" podStartSLOduration=3.7347633670000002 podStartE2EDuration="17.576005112s" podCreationTimestamp="2025-12-01 10:21:41 +0000 UTC" firstStartedPulling="2025-12-01 10:21:44.403971231 +0000 UTC m=+4021.672619989" lastFinishedPulling="2025-12-01 10:21:58.245212966 +0000 UTC m=+4035.513861734" observedRunningTime="2025-12-01 10:21:58.568945703 +0000 UTC m=+4035.837594481" watchObservedRunningTime="2025-12-01 10:21:58.576005112 +0000 UTC m=+4035.844653880" Dec 01 10:22:02 crc kubenswrapper[4763]: I1201 10:22:02.143945 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:22:02 crc kubenswrapper[4763]: I1201 10:22:02.144734 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:22:03 crc kubenswrapper[4763]: I1201 10:22:03.002355 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:22:03 crc kubenswrapper[4763]: E1201 10:22:03.002974 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:22:03 crc kubenswrapper[4763]: I1201 10:22:03.192333 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fsnqs" podUID="678966e0-053a-40f9-b29f-84b8ab6dbc83" containerName="registry-server" probeResult="failure" output=< Dec 01 10:22:03 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 10:22:03 crc kubenswrapper[4763]: > Dec 01 10:22:12 crc kubenswrapper[4763]: I1201 10:22:12.208355 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:22:12 crc kubenswrapper[4763]: I1201 10:22:12.261361 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fsnqs" Dec 01 10:22:12 crc kubenswrapper[4763]: I1201 10:22:12.794648 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fsnqs"] Dec 01 10:22:12 crc kubenswrapper[4763]: I1201 10:22:12.976143 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5qhw"] Dec 01 10:22:12 crc kubenswrapper[4763]: I1201 10:22:12.976408 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5qhw" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="registry-server" 
containerID="cri-o://f7d0659b8bf94921bbf5a75b0d4b553d21ed98f63fce659a55bb72aee3c36b52" gracePeriod=2 Dec 01 10:22:13 crc kubenswrapper[4763]: I1201 10:22:13.695833 4763 generic.go:334] "Generic (PLEG): container finished" podID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerID="f7d0659b8bf94921bbf5a75b0d4b553d21ed98f63fce659a55bb72aee3c36b52" exitCode=0 Dec 01 10:22:13 crc kubenswrapper[4763]: I1201 10:22:13.695921 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5qhw" event={"ID":"9f5677a7-52b7-41f2-b50a-aabcca5dd27c","Type":"ContainerDied","Data":"f7d0659b8bf94921bbf5a75b0d4b553d21ed98f63fce659a55bb72aee3c36b52"} Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.036952 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.134413 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-catalog-content\") pod \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.134594 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-utilities\") pod \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.134974 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7nns\" (UniqueName: \"kubernetes.io/projected/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-kube-api-access-x7nns\") pod \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\" (UID: \"9f5677a7-52b7-41f2-b50a-aabcca5dd27c\") " Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.135378 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-utilities" (OuterVolumeSpecName: "utilities") pod "9f5677a7-52b7-41f2-b50a-aabcca5dd27c" (UID: "9f5677a7-52b7-41f2-b50a-aabcca5dd27c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.135601 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.144447 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-kube-api-access-x7nns" (OuterVolumeSpecName: "kube-api-access-x7nns") pod "9f5677a7-52b7-41f2-b50a-aabcca5dd27c" (UID: "9f5677a7-52b7-41f2-b50a-aabcca5dd27c"). InnerVolumeSpecName "kube-api-access-x7nns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.237709 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7nns\" (UniqueName: \"kubernetes.io/projected/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-kube-api-access-x7nns\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.260773 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f5677a7-52b7-41f2-b50a-aabcca5dd27c" (UID: "9f5677a7-52b7-41f2-b50a-aabcca5dd27c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.339434 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f5677a7-52b7-41f2-b50a-aabcca5dd27c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.706869 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5qhw" event={"ID":"9f5677a7-52b7-41f2-b50a-aabcca5dd27c","Type":"ContainerDied","Data":"5fe832b3b6f95af257dca9a024ba5f1d8f482dbfc314bdc48521009a334ca569"} Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.706912 4763 scope.go:117] "RemoveContainer" containerID="f7d0659b8bf94921bbf5a75b0d4b553d21ed98f63fce659a55bb72aee3c36b52" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.706925 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5qhw" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.744888 4763 scope.go:117] "RemoveContainer" containerID="66d19425da46fc2df5d3e310fcc0e24a5b27140f10cc90f9f949424361d4f28b" Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.747007 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5qhw"] Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.756819 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5qhw"] Dec 01 10:22:14 crc kubenswrapper[4763]: I1201 10:22:14.775800 4763 scope.go:117] "RemoveContainer" containerID="5d9fb1b6275f2b6e09fec053aed2c81a419a0f6d21eb6f33af6388cde8d4c39f" Dec 01 10:22:15 crc kubenswrapper[4763]: I1201 10:22:15.007956 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" path="/var/lib/kubelet/pods/9f5677a7-52b7-41f2-b50a-aabcca5dd27c/volumes" Dec 01 10:22:16 crc kubenswrapper[4763]: I1201 10:22:16.994543 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:22:16 crc kubenswrapper[4763]: E1201 10:22:16.995053 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:22:30 crc kubenswrapper[4763]: I1201 10:22:30.995116 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 
10:22:30 crc kubenswrapper[4763]: E1201 10:22:30.995848 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.385952 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t5zjv"] Dec 01 10:22:36 crc kubenswrapper[4763]: E1201 10:22:36.387229 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="registry-server" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.387253 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="registry-server" Dec 01 10:22:36 crc kubenswrapper[4763]: E1201 10:22:36.387309 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="extract-content" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.387321 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="extract-content" Dec 01 10:22:36 crc kubenswrapper[4763]: E1201 10:22:36.387337 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="extract-utilities" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.387349 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="extract-utilities" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.387722 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5677a7-52b7-41f2-b50a-aabcca5dd27c" containerName="registry-server" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.389888 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.402612 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5zjv"] Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.547429 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-utilities\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.547486 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhljg\" (UniqueName: \"kubernetes.io/projected/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-kube-api-access-fhljg\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.547691 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-catalog-content\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.650091 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-utilities\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.650137 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhljg\" (UniqueName: \"kubernetes.io/projected/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-kube-api-access-fhljg\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.650182 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-catalog-content\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.651171 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-utilities\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.650767 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-catalog-content\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.673118 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fhljg\" (UniqueName: \"kubernetes.io/projected/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-kube-api-access-fhljg\") pod \"redhat-marketplace-t5zjv\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:36 crc kubenswrapper[4763]: I1201 10:22:36.716631 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:37 crc kubenswrapper[4763]: I1201 10:22:37.191723 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5zjv"] Dec 01 10:22:37 crc kubenswrapper[4763]: I1201 10:22:37.965213 4763 generic.go:334] "Generic (PLEG): container finished" podID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerID="49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475" exitCode=0 Dec 01 10:22:37 crc kubenswrapper[4763]: I1201 10:22:37.965318 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5zjv" event={"ID":"1fdf6c26-0898-40b8-8c2e-87cd60fc6529","Type":"ContainerDied","Data":"49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475"} Dec 01 10:22:37 crc kubenswrapper[4763]: I1201 10:22:37.965556 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5zjv" event={"ID":"1fdf6c26-0898-40b8-8c2e-87cd60fc6529","Type":"ContainerStarted","Data":"2e8de90de82c2b1a44d85ad0eb7fb393747dbf40710d6041cb1b13b4d158b532"} Dec 01 10:22:39 crc kubenswrapper[4763]: I1201 10:22:39.986953 4763 generic.go:334] "Generic (PLEG): container finished" podID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerID="ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c" exitCode=0 Dec 01 10:22:39 crc kubenswrapper[4763]: I1201 10:22:39.986995 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5zjv" event={"ID":"1fdf6c26-0898-40b8-8c2e-87cd60fc6529","Type":"ContainerDied","Data":"ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c"} Dec 01 10:22:41 crc kubenswrapper[4763]: I1201 10:22:41.006393 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5zjv" event={"ID":"1fdf6c26-0898-40b8-8c2e-87cd60fc6529","Type":"ContainerStarted","Data":"93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55"} Dec 01 10:22:41 crc kubenswrapper[4763]: I1201 10:22:41.043796 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t5zjv" podStartSLOduration=2.469355305 podStartE2EDuration="5.04377754s" podCreationTimestamp="2025-12-01 10:22:36 +0000 UTC" firstStartedPulling="2025-12-01 10:22:37.968191557 +0000 UTC m=+4075.236840375" lastFinishedPulling="2025-12-01 10:22:40.542613842 +0000 UTC m=+4077.811262610" observedRunningTime="2025-12-01 10:22:41.042680181 +0000 UTC m=+4078.311328949" watchObservedRunningTime="2025-12-01 10:22:41.04377754 +0000 UTC m=+4078.312426308" Dec 01 10:22:44 crc kubenswrapper[4763]: I1201 10:22:44.994269 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:22:44 crc kubenswrapper[4763]: E1201 10:22:44.995170 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:22:46 crc kubenswrapper[4763]: I1201 10:22:46.717577 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:46 crc kubenswrapper[4763]: I1201 10:22:46.717949 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:46 crc kubenswrapper[4763]: I1201 10:22:46.767742 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:47 crc kubenswrapper[4763]: I1201 10:22:47.109341 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:47 crc kubenswrapper[4763]: I1201 10:22:47.162295 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5zjv"] Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.091081 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t5zjv" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerName="registry-server" containerID="cri-o://93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55" gracePeriod=2 Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.572485 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.766151 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-catalog-content\") pod \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.766244 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-utilities\") pod \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.766279 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhljg\" (UniqueName: \"kubernetes.io/projected/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-kube-api-access-fhljg\") pod \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\" (UID: \"1fdf6c26-0898-40b8-8c2e-87cd60fc6529\") " Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.768179 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-utilities" (OuterVolumeSpecName: "utilities") pod "1fdf6c26-0898-40b8-8c2e-87cd60fc6529" (UID: "1fdf6c26-0898-40b8-8c2e-87cd60fc6529"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.776280 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-kube-api-access-fhljg" (OuterVolumeSpecName: "kube-api-access-fhljg") pod "1fdf6c26-0898-40b8-8c2e-87cd60fc6529" (UID: "1fdf6c26-0898-40b8-8c2e-87cd60fc6529"). InnerVolumeSpecName "kube-api-access-fhljg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.788146 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fdf6c26-0898-40b8-8c2e-87cd60fc6529" (UID: "1fdf6c26-0898-40b8-8c2e-87cd60fc6529"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.869185 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.869221 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:49 crc kubenswrapper[4763]: I1201 10:22:49.869235 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhljg\" (UniqueName: \"kubernetes.io/projected/1fdf6c26-0898-40b8-8c2e-87cd60fc6529-kube-api-access-fhljg\") on node \"crc\" DevicePath \"\"" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.105425 4763 generic.go:334] "Generic (PLEG): container finished" podID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerID="93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55" exitCode=0 Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.105652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5zjv" event={"ID":"1fdf6c26-0898-40b8-8c2e-87cd60fc6529","Type":"ContainerDied","Data":"93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55"} Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.106063 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5zjv" event={"ID":"1fdf6c26-0898-40b8-8c2e-87cd60fc6529","Type":"ContainerDied","Data":"2e8de90de82c2b1a44d85ad0eb7fb393747dbf40710d6041cb1b13b4d158b532"} Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.106110 4763 scope.go:117] "RemoveContainer" containerID="93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.105807 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5zjv" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.142643 4763 scope.go:117] "RemoveContainer" containerID="ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.149603 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5zjv"] Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.159127 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5zjv"] Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.166634 4763 scope.go:117] "RemoveContainer" containerID="49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.217486 4763 scope.go:117] "RemoveContainer" containerID="93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55" Dec 01 10:22:50 crc kubenswrapper[4763]: E1201 10:22:50.217912 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55\": container with ID starting with 93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55 not found: ID does not exist" containerID="93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.217952 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55"} err="failed to get container status \"93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55\": rpc error: code = NotFound desc = could not find container \"93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55\": container with ID starting with 93c4ba66fc8c79f1d441c7f3b52cdba00d12b626f08c37c44ff769b9dc94fa55 not found: ID does not exist" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.217976 4763 scope.go:117] "RemoveContainer" containerID="ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c" Dec 01 10:22:50 crc kubenswrapper[4763]: E1201 10:22:50.219064 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c\": container with ID starting with ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c not found: ID does not exist" containerID="ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.219087 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c"} err="failed to get container status \"ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c\": rpc error: code = NotFound desc = could not find container \"ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c\": container with ID starting with ce6fe4096e52c3da4996111bfb9363609f24482b05017e4c55a28efbf3e9458c not found: ID does not exist" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.219099 4763 scope.go:117] "RemoveContainer" containerID="49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475" Dec 01 10:22:50 crc kubenswrapper[4763]: E1201 10:22:50.219357 4763 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475\": container with ID starting with 49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475 not found: ID does not exist" containerID="49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475" Dec 01 10:22:50 crc kubenswrapper[4763]: I1201 10:22:50.219378 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475"} err="failed to get container status \"49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475\": rpc error: code = NotFound desc = could not find container \"49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475\": container with ID starting with 49d00f93ba37681e02977fffcc9e53d25d9b9ce42629401d3b691e4229661475 not found: ID does not exist" Dec 01 10:22:51 crc kubenswrapper[4763]: I1201 10:22:51.004596 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" path="/var/lib/kubelet/pods/1fdf6c26-0898-40b8-8c2e-87cd60fc6529/volumes" Dec 01 10:22:56 crc kubenswrapper[4763]: I1201 10:22:56.994074 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:22:56 crc kubenswrapper[4763]: E1201 10:22:56.994833 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:23:11 crc kubenswrapper[4763]: I1201 10:23:11.994152 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:23:11 crc kubenswrapper[4763]: E1201 10:23:11.994991 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:23:23 crc kubenswrapper[4763]: I1201 10:23:23.022714 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:23:23 crc kubenswrapper[4763]: E1201 10:23:23.024781 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:23:33 crc kubenswrapper[4763]: I1201 10:23:33.994631 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:23:34 crc kubenswrapper[4763]: I1201 10:23:34.567677 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"721abf0c9085ab0076360747c073c1cfd89cac6d0e20042d7bbd0698eb2ef82e"} Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.613841 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fwrm7"] Dec 01 10:24:06 crc kubenswrapper[4763]: E1201 10:24:06.615038 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerName="extract-utilities" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.615060 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerName="extract-utilities" Dec 01 10:24:06 crc kubenswrapper[4763]: E1201 10:24:06.615078 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerName="registry-server" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.615086 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerName="registry-server" Dec 01 10:24:06 crc kubenswrapper[4763]: E1201 10:24:06.615100 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerName="extract-content" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.615109 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerName="extract-content" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.615340 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fdf6c26-0898-40b8-8c2e-87cd60fc6529" containerName="registry-server" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.626320 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.630484 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwrm7"] Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.698912 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-catalog-content\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.699337 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv8sz\" (UniqueName: \"kubernetes.io/projected/2ea24eed-1e49-4142-b60c-791a25a3d29d-kube-api-access-bv8sz\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.699575 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-utilities\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.800492 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv8sz\" (UniqueName: \"kubernetes.io/projected/2ea24eed-1e49-4142-b60c-791a25a3d29d-kube-api-access-bv8sz\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.800615 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-utilities\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.800675 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-catalog-content\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.801086 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-utilities\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.801104 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-catalog-content\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.821963 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bv8sz\" (UniqueName: \"kubernetes.io/projected/2ea24eed-1e49-4142-b60c-791a25a3d29d-kube-api-access-bv8sz\") pod \"certified-operators-fwrm7\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:06 crc kubenswrapper[4763]: I1201 10:24:06.963636 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:07 crc kubenswrapper[4763]: I1201 10:24:07.553635 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fwrm7"] Dec 01 10:24:07 crc kubenswrapper[4763]: I1201 10:24:07.885919 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerID="c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0" exitCode=0 Dec 01 10:24:07 crc kubenswrapper[4763]: I1201 10:24:07.886095 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwrm7" event={"ID":"2ea24eed-1e49-4142-b60c-791a25a3d29d","Type":"ContainerDied","Data":"c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0"} Dec 01 10:24:07 crc kubenswrapper[4763]: I1201 10:24:07.886259 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwrm7" event={"ID":"2ea24eed-1e49-4142-b60c-791a25a3d29d","Type":"ContainerStarted","Data":"a7be3df4362be14e1c87e8cd837bb9a8f3a773b85c431adc869ead16add3d798"} Dec 01 10:24:09 crc kubenswrapper[4763]: I1201 10:24:09.908412 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwrm7" event={"ID":"2ea24eed-1e49-4142-b60c-791a25a3d29d","Type":"ContainerStarted","Data":"0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf"} Dec 01 10:24:10 crc kubenswrapper[4763]: I1201 10:24:10.940848 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerID="0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf" exitCode=0 Dec 01 10:24:10 crc kubenswrapper[4763]: I1201 10:24:10.940905 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwrm7" event={"ID":"2ea24eed-1e49-4142-b60c-791a25a3d29d","Type":"ContainerDied","Data":"0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf"} Dec 01 10:24:11 crc kubenswrapper[4763]: I1201 10:24:11.955655 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwrm7" event={"ID":"2ea24eed-1e49-4142-b60c-791a25a3d29d","Type":"ContainerStarted","Data":"e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a"} Dec 01 10:24:11 crc kubenswrapper[4763]: I1201 10:24:11.974849 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fwrm7" podStartSLOduration=2.42578003 podStartE2EDuration="5.974815835s" podCreationTimestamp="2025-12-01 10:24:06 +0000 UTC" firstStartedPulling="2025-12-01 10:24:07.887881665 +0000 UTC m=+4165.156530473" lastFinishedPulling="2025-12-01 10:24:11.43691751 +0000 UTC m=+4168.705566278" observedRunningTime="2025-12-01 10:24:11.973895791 +0000 UTC m=+4169.242544559" watchObservedRunningTime="2025-12-01 10:24:11.974815835 +0000 UTC m=+4169.243464603" Dec 01 10:24:16 crc kubenswrapper[4763]: I1201 10:24:16.963881 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:16 crc kubenswrapper[4763]: I1201 10:24:16.964442 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:17 crc kubenswrapper[4763]: I1201 10:24:17.027257 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:17 crc kubenswrapper[4763]: I1201 10:24:17.084408 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:17 crc kubenswrapper[4763]: I1201 10:24:17.264211 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwrm7"] Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.028252 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fwrm7" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerName="registry-server" containerID="cri-o://e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a" gracePeriod=2 Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.596252 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.781583 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv8sz\" (UniqueName: \"kubernetes.io/projected/2ea24eed-1e49-4142-b60c-791a25a3d29d-kube-api-access-bv8sz\") pod \"2ea24eed-1e49-4142-b60c-791a25a3d29d\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.781652 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-utilities\") pod \"2ea24eed-1e49-4142-b60c-791a25a3d29d\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.781702 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-catalog-content\") pod \"2ea24eed-1e49-4142-b60c-791a25a3d29d\" (UID: \"2ea24eed-1e49-4142-b60c-791a25a3d29d\") " Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.782541 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-utilities" (OuterVolumeSpecName: "utilities") pod "2ea24eed-1e49-4142-b60c-791a25a3d29d" (UID: "2ea24eed-1e49-4142-b60c-791a25a3d29d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.791207 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea24eed-1e49-4142-b60c-791a25a3d29d-kube-api-access-bv8sz" (OuterVolumeSpecName: "kube-api-access-bv8sz") pod "2ea24eed-1e49-4142-b60c-791a25a3d29d" (UID: "2ea24eed-1e49-4142-b60c-791a25a3d29d"). InnerVolumeSpecName "kube-api-access-bv8sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.827319 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ea24eed-1e49-4142-b60c-791a25a3d29d" (UID: "2ea24eed-1e49-4142-b60c-791a25a3d29d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.884493 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv8sz\" (UniqueName: \"kubernetes.io/projected/2ea24eed-1e49-4142-b60c-791a25a3d29d-kube-api-access-bv8sz\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.884531 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:19 crc kubenswrapper[4763]: I1201 10:24:19.884547 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ea24eed-1e49-4142-b60c-791a25a3d29d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.040492 4763 generic.go:334] "Generic (PLEG): container finished" podID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerID="e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a" exitCode=0 Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.040579 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fwrm7" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.040562 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwrm7" event={"ID":"2ea24eed-1e49-4142-b60c-791a25a3d29d","Type":"ContainerDied","Data":"e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a"} Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.040685 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fwrm7" event={"ID":"2ea24eed-1e49-4142-b60c-791a25a3d29d","Type":"ContainerDied","Data":"a7be3df4362be14e1c87e8cd837bb9a8f3a773b85c431adc869ead16add3d798"} Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.040733 4763 scope.go:117] "RemoveContainer" containerID="e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.081890 4763 scope.go:117] "RemoveContainer" containerID="0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.112211 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fwrm7"] Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.121080 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fwrm7"] Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.126006 4763 scope.go:117] "RemoveContainer" containerID="c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.182677 4763 scope.go:117] "RemoveContainer" containerID="e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a" Dec 01 10:24:20 crc kubenswrapper[4763]: E1201 10:24:20.185426 4763 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a\": container with ID starting with e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a not found: ID does not exist" containerID="e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.185516 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a"} err="failed to get container status \"e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a\": rpc error: code = NotFound desc = could not find container \"e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a\": container with ID starting with e985aa6fca3f63a48c47ae0020065d148fb6de37e319baacadcb6131e2e7813a not found: ID does not exist" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.185556 4763 scope.go:117] "RemoveContainer" containerID="0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf" Dec 01 10:24:20 crc kubenswrapper[4763]: E1201 10:24:20.186390 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf\": container with ID starting with 0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf not found: ID does not exist" containerID="0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.186425 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf"} err="failed to get container status \"0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf\": rpc error: code = NotFound desc = could not find container \"0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf\": container with ID starting with 0d10e673d4d4373288ecf85758ee35c980f29567eda928fecc37c268122c99cf not found: ID does not exist" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.186492 4763 scope.go:117] "RemoveContainer" containerID="c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0" Dec 01 10:24:20 crc kubenswrapper[4763]: E1201 10:24:20.187206 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0\": container with ID starting with c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0 not found: ID does not exist" containerID="c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0" Dec 01 10:24:20 crc kubenswrapper[4763]: I1201 10:24:20.187240 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0"} err="failed to get container status \"c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0\": rpc error: code = NotFound desc = could not find container \"c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0\": container with ID starting with c46fc5da681f6e37cdd3f4c11595cb5fe9e4e41b53387a273e20e53d754c6bf0 not found: ID does not exist" Dec 01 10:24:21 crc kubenswrapper[4763]: I1201 10:24:21.007235 4763 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" path="/var/lib/kubelet/pods/2ea24eed-1e49-4142-b60c-791a25a3d29d/volumes" Dec 01 10:25:42 crc kubenswrapper[4763]: I1201 10:25:42.907186 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8c4pm"] Dec 01 10:25:42 crc kubenswrapper[4763]: E1201 10:25:42.908360 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerName="extract-utilities" Dec 01 10:25:42 crc kubenswrapper[4763]: I1201 10:25:42.908384 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerName="extract-utilities" Dec 01 10:25:42 crc kubenswrapper[4763]: E1201 10:25:42.908412 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerName="extract-content" Dec 01 10:25:42 crc kubenswrapper[4763]: I1201 10:25:42.908423 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerName="extract-content" Dec 01 10:25:42 crc kubenswrapper[4763]: E1201 10:25:42.908450 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerName="registry-server" Dec 01 10:25:42 crc kubenswrapper[4763]: I1201 10:25:42.908579 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerName="registry-server" Dec 01 10:25:42 crc kubenswrapper[4763]: I1201 10:25:42.908857 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea24eed-1e49-4142-b60c-791a25a3d29d" containerName="registry-server" Dec 01 10:25:42 crc kubenswrapper[4763]: I1201 10:25:42.910671 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:42 crc kubenswrapper[4763]: I1201 10:25:42.943794 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8c4pm"] Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.015608 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj5dn\" (UniqueName: \"kubernetes.io/projected/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-kube-api-access-kj5dn\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.015738 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-catalog-content\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.015769 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-utilities\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.117632 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-catalog-content\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.117694 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-utilities\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.117805 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj5dn\" (UniqueName: \"kubernetes.io/projected/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-kube-api-access-kj5dn\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.118747 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-catalog-content\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.119023 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-utilities\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.164162 4763 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kj5dn\" (UniqueName: \"kubernetes.io/projected/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-kube-api-access-kj5dn\") pod \"community-operators-8c4pm\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.247368 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.760734 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8c4pm"] Dec 01 10:25:43 crc kubenswrapper[4763]: I1201 10:25:43.964345 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c4pm" event={"ID":"30423622-4a52-4f6b-84ef-fbc2fe05b1c6","Type":"ContainerStarted","Data":"90b6374ad1b07e067fe0ddcbf736c26b913885b9d1e3e2d4254544a11f0117c6"} Dec 01 10:25:44 crc kubenswrapper[4763]: I1201 10:25:44.973935 4763 generic.go:334] "Generic (PLEG): container finished" podID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerID="6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0" exitCode=0 Dec 01 10:25:44 crc kubenswrapper[4763]: I1201 10:25:44.973979 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c4pm" event={"ID":"30423622-4a52-4f6b-84ef-fbc2fe05b1c6","Type":"ContainerDied","Data":"6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0"} Dec 01 10:25:45 crc kubenswrapper[4763]: I1201 10:25:45.985360 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c4pm" event={"ID":"30423622-4a52-4f6b-84ef-fbc2fe05b1c6","Type":"ContainerStarted","Data":"118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91"} Dec 01 10:25:46 crc kubenswrapper[4763]: I1201 10:25:46.996680 4763 generic.go:334] "Generic (PLEG): container finished" podID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerID="118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91" exitCode=0 Dec 01 10:25:47 crc kubenswrapper[4763]: I1201 10:25:47.007544 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c4pm" event={"ID":"30423622-4a52-4f6b-84ef-fbc2fe05b1c6","Type":"ContainerDied","Data":"118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91"} Dec 01 10:25:49 crc kubenswrapper[4763]: I1201 10:25:49.017724 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c4pm" event={"ID":"30423622-4a52-4f6b-84ef-fbc2fe05b1c6","Type":"ContainerStarted","Data":"e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539"} Dec 01 10:25:49 crc kubenswrapper[4763]: I1201 10:25:49.110047 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8c4pm" podStartSLOduration=4.203604574 podStartE2EDuration="7.107771676s" podCreationTimestamp="2025-12-01 10:25:42 +0000 UTC" firstStartedPulling="2025-12-01 10:25:44.977192992 +0000 UTC m=+4262.245841760" lastFinishedPulling="2025-12-01 10:25:47.881360094 +0000 UTC m=+4265.150008862" observedRunningTime="2025-12-01 10:25:49.09227214 +0000 UTC m=+4266.360920928" watchObservedRunningTime="2025-12-01 10:25:49.107771676 +0000 UTC m=+4266.376420444" Dec 01 10:25:53 crc kubenswrapper[4763]: I1201 10:25:53.248060 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:53 crc kubenswrapper[4763]: I1201 10:25:53.248553 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:53 crc kubenswrapper[4763]: I1201 10:25:53.298019 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:54 crc kubenswrapper[4763]: I1201 10:25:54.124970 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:54 crc kubenswrapper[4763]: I1201 10:25:54.192334 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8c4pm"] Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.090193 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8c4pm" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerName="registry-server" containerID="cri-o://e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539" gracePeriod=2 Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.650251 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.790322 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj5dn\" (UniqueName: \"kubernetes.io/projected/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-kube-api-access-kj5dn\") pod \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.790573 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-utilities\") pod \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.790678 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-catalog-content\") pod \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\" (UID: \"30423622-4a52-4f6b-84ef-fbc2fe05b1c6\") " Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.791671 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-utilities" (OuterVolumeSpecName: "utilities") pod "30423622-4a52-4f6b-84ef-fbc2fe05b1c6" (UID: "30423622-4a52-4f6b-84ef-fbc2fe05b1c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.804304 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-kube-api-access-kj5dn" (OuterVolumeSpecName: "kube-api-access-kj5dn") pod "30423622-4a52-4f6b-84ef-fbc2fe05b1c6" (UID: "30423622-4a52-4f6b-84ef-fbc2fe05b1c6"). InnerVolumeSpecName "kube-api-access-kj5dn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.845083 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30423622-4a52-4f6b-84ef-fbc2fe05b1c6" (UID: "30423622-4a52-4f6b-84ef-fbc2fe05b1c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.893911 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj5dn\" (UniqueName: \"kubernetes.io/projected/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-kube-api-access-kj5dn\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.893946 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:56 crc kubenswrapper[4763]: I1201 10:25:56.893957 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30423622-4a52-4f6b-84ef-fbc2fe05b1c6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.100439 4763 generic.go:334] "Generic (PLEG): container finished" podID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerID="e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539" exitCode=0 Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.100498 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c4pm" event={"ID":"30423622-4a52-4f6b-84ef-fbc2fe05b1c6","Type":"ContainerDied","Data":"e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539"} Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.100528 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8c4pm" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.100533 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8c4pm" event={"ID":"30423622-4a52-4f6b-84ef-fbc2fe05b1c6","Type":"ContainerDied","Data":"90b6374ad1b07e067fe0ddcbf736c26b913885b9d1e3e2d4254544a11f0117c6"} Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.100556 4763 scope.go:117] "RemoveContainer" containerID="e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.129549 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8c4pm"] Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.130706 4763 scope.go:117] "RemoveContainer" containerID="118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.141798 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8c4pm"] Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.158051 4763 scope.go:117] "RemoveContainer" containerID="6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.219775 4763 scope.go:117] "RemoveContainer" containerID="e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539" Dec 01 10:25:57 crc kubenswrapper[4763]: E1201 10:25:57.220202 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539\": container with ID starting with e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539 not found: ID does not exist" containerID="e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.220253 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539"} err="failed to get container status \"e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539\": rpc error: code = NotFound desc = could not find container \"e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539\": container with ID starting with e0b992f3123be5e790314d769d036b00a6115fe68cf935bd14b97bf98eb0d539 not found: ID does not exist" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.220285 4763 scope.go:117] "RemoveContainer" containerID="118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91" Dec 01 10:25:57 crc kubenswrapper[4763]: E1201 10:25:57.220603 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91\": container with ID starting with 118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91 not found: ID does not exist" containerID="118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.220642 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91"} err="failed to get container status \"118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91\": rpc error: code = NotFound desc = could not find 
container \"118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91\": container with ID starting with 118e35ce3e298225bead84d98a1dc409baf7ab22e56aaf532da1303ef7685d91 not found: ID does not exist" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.220661 4763 scope.go:117] "RemoveContainer" containerID="6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0" Dec 01 10:25:57 crc kubenswrapper[4763]: E1201 10:25:57.220886 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0\": container with ID starting with 6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0 not found: ID does not exist" containerID="6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0" Dec 01 10:25:57 crc kubenswrapper[4763]: I1201 10:25:57.220917 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0"} err="failed to get container status \"6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0\": rpc error: code = NotFound desc = could not find container \"6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0\": container with ID starting with 6b6b58410f9512b7415cfee4fa2287b8f0b16b86982eb768727b2e132a1f92f0 not found: ID does not exist" Dec 01 10:25:59 crc kubenswrapper[4763]: I1201 10:25:59.005830 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" path="/var/lib/kubelet/pods/30423622-4a52-4f6b-84ef-fbc2fe05b1c6/volumes" Dec 01 10:26:03 crc kubenswrapper[4763]: I1201 10:26:03.929752 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:26:03 crc kubenswrapper[4763]: I1201 10:26:03.930385 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:26:33 crc kubenswrapper[4763]: I1201 10:26:33.929255 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:26:33 crc kubenswrapper[4763]: I1201 10:26:33.929878 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:26:51 crc kubenswrapper[4763]: I1201 10:26:51.618762 4763 generic.go:334] "Generic (PLEG): container finished" podID="74d118c3-e544-4a7a-ad22-de496e16f9ee" containerID="1e41c8399655877e1c581b04d2303ad6b82cd87dc49b44233448dbc0ab8abcd4" exitCode=0 Dec 01 10:26:51 crc kubenswrapper[4763]: I1201 10:26:51.619289 4763 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/tempest-tests-tempest" event={"ID":"74d118c3-e544-4a7a-ad22-de496e16f9ee","Type":"ContainerDied","Data":"1e41c8399655877e1c581b04d2303ad6b82cd87dc49b44233448dbc0ab8abcd4"} Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.063923 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.197672 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t49vh\" (UniqueName: \"kubernetes.io/projected/74d118c3-e544-4a7a-ad22-de496e16f9ee-kube-api-access-t49vh\") pod \"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.197751 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config\") pod \"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.197853 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config-secret\") pod \"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.197931 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-temporary\") pod \"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.197963 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ssh-key\") pod \"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.198007 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-config-data\") pod \"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.198033 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ca-certs\") pod \"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.198103 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.198159 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-workdir\") pod 
\"74d118c3-e544-4a7a-ad22-de496e16f9ee\" (UID: \"74d118c3-e544-4a7a-ad22-de496e16f9ee\") " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.199584 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.203732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-config-data" (OuterVolumeSpecName: "config-data") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.207115 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.212968 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d118c3-e544-4a7a-ad22-de496e16f9ee-kube-api-access-t49vh" (OuterVolumeSpecName: "kube-api-access-t49vh") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "kube-api-access-t49vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.215003 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.239387 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.239638 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.252347 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.271295 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "74d118c3-e544-4a7a-ad22-de496e16f9ee" (UID: "74d118c3-e544-4a7a-ad22-de496e16f9ee"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.300579 4763 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.300631 4763 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.300644 4763 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.300658 4763 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.301552 4763 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.301578 4763 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/74d118c3-e544-4a7a-ad22-de496e16f9ee-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.306631 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t49vh\" (UniqueName: \"kubernetes.io/projected/74d118c3-e544-4a7a-ad22-de496e16f9ee-kube-api-access-t49vh\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.306659 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.306684 4763 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/74d118c3-e544-4a7a-ad22-de496e16f9ee-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.327406 4763 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.409634 4763 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.649423 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"74d118c3-e544-4a7a-ad22-de496e16f9ee","Type":"ContainerDied","Data":"a043188c8ae77975dd6aa4d3b81ea337abe359a7934a4b50450abf85dec3f556"} Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.650073 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a043188c8ae77975dd6aa4d3b81ea337abe359a7934a4b50450abf85dec3f556" Dec 01 10:26:53 crc kubenswrapper[4763]: I1201 10:26:53.650235 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.408134 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 10:27:02 crc kubenswrapper[4763]: E1201 10:27:02.409039 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerName="registry-server" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.409052 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerName="registry-server" Dec 01 10:27:02 crc kubenswrapper[4763]: E1201 10:27:02.409061 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d118c3-e544-4a7a-ad22-de496e16f9ee" containerName="tempest-tests-tempest-tests-runner" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.409067 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d118c3-e544-4a7a-ad22-de496e16f9ee" containerName="tempest-tests-tempest-tests-runner" Dec 01 10:27:02 crc kubenswrapper[4763]: E1201 10:27:02.409090 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerName="extract-utilities" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.409096 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerName="extract-utilities" Dec 01 10:27:02 crc kubenswrapper[4763]: E1201 10:27:02.409134 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerName="extract-content" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.409139 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerName="extract-content" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.409320 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d118c3-e544-4a7a-ad22-de496e16f9ee" containerName="tempest-tests-tempest-tests-runner" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.409353 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="30423622-4a52-4f6b-84ef-fbc2fe05b1c6" containerName="registry-server" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.409953 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.417482 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kfvgw" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.423555 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.587836 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0ee43b4-efab-48de-ba62-dedd80d83711\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.587959 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplc7\" (UniqueName: \"kubernetes.io/projected/c0ee43b4-efab-48de-ba62-dedd80d83711-kube-api-access-hplc7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0ee43b4-efab-48de-ba62-dedd80d83711\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.689795 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0ee43b4-efab-48de-ba62-dedd80d83711\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.689952 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplc7\" (UniqueName: \"kubernetes.io/projected/c0ee43b4-efab-48de-ba62-dedd80d83711-kube-api-access-hplc7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0ee43b4-efab-48de-ba62-dedd80d83711\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.690764 4763 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0ee43b4-efab-48de-ba62-dedd80d83711\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.708640 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplc7\" (UniqueName: \"kubernetes.io/projected/c0ee43b4-efab-48de-ba62-dedd80d83711-kube-api-access-hplc7\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0ee43b4-efab-48de-ba62-dedd80d83711\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:02 crc kubenswrapper[4763]: I1201 10:27:02.718157 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c0ee43b4-efab-48de-ba62-dedd80d83711\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:02 crc 
kubenswrapper[4763]: I1201 10:27:02.733430 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 01 10:27:03 crc kubenswrapper[4763]: I1201 10:27:03.189717 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 01 10:27:03 crc kubenswrapper[4763]: I1201 10:27:03.212736 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:27:03 crc kubenswrapper[4763]: I1201 10:27:03.773070 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c0ee43b4-efab-48de-ba62-dedd80d83711","Type":"ContainerStarted","Data":"f291ff84c563decfdb4a69cce2442a9f7d0a4031c7ec30e6ec525dd622cd8d57"} Dec 01 10:27:03 crc kubenswrapper[4763]: I1201 10:27:03.929267 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:27:03 crc kubenswrapper[4763]: I1201 10:27:03.929344 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:27:03 crc kubenswrapper[4763]: I1201 10:27:03.929395 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 10:27:03 crc kubenswrapper[4763]: I1201 10:27:03.930273 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"721abf0c9085ab0076360747c073c1cfd89cac6d0e20042d7bbd0698eb2ef82e"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:27:03 crc kubenswrapper[4763]: I1201 10:27:03.930347 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://721abf0c9085ab0076360747c073c1cfd89cac6d0e20042d7bbd0698eb2ef82e" gracePeriod=600 Dec 01 10:27:04 crc kubenswrapper[4763]: I1201 10:27:04.785134 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c0ee43b4-efab-48de-ba62-dedd80d83711","Type":"ContainerStarted","Data":"abf8cbd341fbeb79f3b231914b21525061bbd4c59c115fcd3af455003ab007b0"} Dec 01 10:27:04 crc kubenswrapper[4763]: I1201 10:27:04.789589 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="721abf0c9085ab0076360747c073c1cfd89cac6d0e20042d7bbd0698eb2ef82e" exitCode=0 Dec 01 10:27:04 crc kubenswrapper[4763]: I1201 10:27:04.789630 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" 
event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"721abf0c9085ab0076360747c073c1cfd89cac6d0e20042d7bbd0698eb2ef82e"} Dec 01 10:27:04 crc kubenswrapper[4763]: I1201 10:27:04.789652 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2"} Dec 01 10:27:04 crc kubenswrapper[4763]: I1201 10:27:04.789667 4763 scope.go:117] "RemoveContainer" containerID="dd0914c4e4f62f03197cd9476b6154f872b0a4b9f3ae0c23a336d7eddfb837d6" Dec 01 10:27:04 crc kubenswrapper[4763]: I1201 10:27:04.803401 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.783825045 podStartE2EDuration="2.803384024s" podCreationTimestamp="2025-12-01 10:27:02 +0000 UTC" firstStartedPulling="2025-12-01 10:27:03.212531341 +0000 UTC m=+4340.481180109" lastFinishedPulling="2025-12-01 10:27:04.23209032 +0000 UTC m=+4341.500739088" observedRunningTime="2025-12-01 10:27:04.803280532 +0000 UTC m=+4342.071929300" watchObservedRunningTime="2025-12-01 10:27:04.803384024 +0000 UTC m=+4342.072032792" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.722828 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-prg7h/must-gather-mvdt6"] Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.725774 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.728201 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-prg7h"/"openshift-service-ca.crt" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.728509 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-prg7h"/"default-dockercfg-fqt56" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.744069 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-prg7h"/"kube-root-ca.crt" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.751241 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-prg7h/must-gather-mvdt6"] Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.860363 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/21287b27-72e1-4209-ad3a-412d76cdd1fe-must-gather-output\") pod \"must-gather-mvdt6\" (UID: \"21287b27-72e1-4209-ad3a-412d76cdd1fe\") " pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.860473 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skmmx\" (UniqueName: \"kubernetes.io/projected/21287b27-72e1-4209-ad3a-412d76cdd1fe-kube-api-access-skmmx\") pod \"must-gather-mvdt6\" (UID: \"21287b27-72e1-4209-ad3a-412d76cdd1fe\") " pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.962686 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skmmx\" (UniqueName: \"kubernetes.io/projected/21287b27-72e1-4209-ad3a-412d76cdd1fe-kube-api-access-skmmx\") pod \"must-gather-mvdt6\" (UID: 
\"21287b27-72e1-4209-ad3a-412d76cdd1fe\") " pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.963142 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/21287b27-72e1-4209-ad3a-412d76cdd1fe-must-gather-output\") pod \"must-gather-mvdt6\" (UID: \"21287b27-72e1-4209-ad3a-412d76cdd1fe\") " pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.963692 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/21287b27-72e1-4209-ad3a-412d76cdd1fe-must-gather-output\") pod \"must-gather-mvdt6\" (UID: \"21287b27-72e1-4209-ad3a-412d76cdd1fe\") " pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:27:31 crc kubenswrapper[4763]: I1201 10:27:31.983299 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skmmx\" (UniqueName: \"kubernetes.io/projected/21287b27-72e1-4209-ad3a-412d76cdd1fe-kube-api-access-skmmx\") pod \"must-gather-mvdt6\" (UID: \"21287b27-72e1-4209-ad3a-412d76cdd1fe\") " pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:27:32 crc kubenswrapper[4763]: I1201 10:27:32.064023 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:27:32 crc kubenswrapper[4763]: I1201 10:27:32.552060 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-prg7h/must-gather-mvdt6"] Dec 01 10:27:33 crc kubenswrapper[4763]: I1201 10:27:33.377579 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/must-gather-mvdt6" event={"ID":"21287b27-72e1-4209-ad3a-412d76cdd1fe","Type":"ContainerStarted","Data":"fa42c874ab8b90fb7772fe0fdb880481f70701705000b1bda6b9d58be8d5ffb5"} Dec 01 10:27:41 crc kubenswrapper[4763]: I1201 10:27:41.472835 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/must-gather-mvdt6" event={"ID":"21287b27-72e1-4209-ad3a-412d76cdd1fe","Type":"ContainerStarted","Data":"e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1"} Dec 01 10:27:41 crc kubenswrapper[4763]: I1201 10:27:41.473489 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/must-gather-mvdt6" event={"ID":"21287b27-72e1-4209-ad3a-412d76cdd1fe","Type":"ContainerStarted","Data":"de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6"} Dec 01 10:27:42 crc kubenswrapper[4763]: I1201 10:27:42.524800 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-prg7h/must-gather-mvdt6" podStartSLOduration=4.889417966 podStartE2EDuration="11.524781144s" podCreationTimestamp="2025-12-01 10:27:31 +0000 UTC" firstStartedPulling="2025-12-01 10:27:32.554673774 +0000 UTC m=+4369.823322542" lastFinishedPulling="2025-12-01 10:27:39.190036942 +0000 UTC m=+4376.458685720" observedRunningTime="2025-12-01 10:27:42.515257448 +0000 UTC m=+4379.783906216" watchObservedRunningTime="2025-12-01 10:27:42.524781144 +0000 UTC m=+4379.793429912" Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.204984 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-prg7h/crc-debug-2vkz2"] Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.206529 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.326825 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e0fa98c-0264-41f1-a5a3-50616bef1c55-host\") pod \"crc-debug-2vkz2\" (UID: \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\") " pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.326936 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bgxq\" (UniqueName: \"kubernetes.io/projected/5e0fa98c-0264-41f1-a5a3-50616bef1c55-kube-api-access-6bgxq\") pod \"crc-debug-2vkz2\" (UID: \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\") " pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.428759 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e0fa98c-0264-41f1-a5a3-50616bef1c55-host\") pod \"crc-debug-2vkz2\" (UID: \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\") " pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.429060 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bgxq\" (UniqueName: \"kubernetes.io/projected/5e0fa98c-0264-41f1-a5a3-50616bef1c55-kube-api-access-6bgxq\") pod \"crc-debug-2vkz2\" (UID: \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\") " pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.428842 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e0fa98c-0264-41f1-a5a3-50616bef1c55-host\") pod \"crc-debug-2vkz2\" (UID: \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\") " pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.455496 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bgxq\" (UniqueName: \"kubernetes.io/projected/5e0fa98c-0264-41f1-a5a3-50616bef1c55-kube-api-access-6bgxq\") pod \"crc-debug-2vkz2\" (UID: \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\") " pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:27:47 crc kubenswrapper[4763]: I1201 10:27:47.525658 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:27:47 crc kubenswrapper[4763]: W1201 10:27:47.572388 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0fa98c_0264_41f1_a5a3_50616bef1c55.slice/crio-77c846bcbe90f26c7b14db9463691cc418e9947945e44bfec81d5231ab754c8e WatchSource:0}: Error finding container 77c846bcbe90f26c7b14db9463691cc418e9947945e44bfec81d5231ab754c8e: Status 404 returned error can't find the container with id 77c846bcbe90f26c7b14db9463691cc418e9947945e44bfec81d5231ab754c8e Dec 01 10:27:48 crc kubenswrapper[4763]: I1201 10:27:48.560291 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/crc-debug-2vkz2" event={"ID":"5e0fa98c-0264-41f1-a5a3-50616bef1c55","Type":"ContainerStarted","Data":"77c846bcbe90f26c7b14db9463691cc418e9947945e44bfec81d5231ab754c8e"} Dec 01 10:27:58 crc kubenswrapper[4763]: I1201 10:27:58.696113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/crc-debug-2vkz2" event={"ID":"5e0fa98c-0264-41f1-a5a3-50616bef1c55","Type":"ContainerStarted","Data":"d26e6af64ece59b47dc8030d3167b91b18d6949e5fe167d0233e9113c29d8cb9"} Dec 01 10:27:58 crc kubenswrapper[4763]: I1201 10:27:58.721194 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-prg7h/crc-debug-2vkz2" podStartSLOduration=1.501753374 podStartE2EDuration="11.721172197s" podCreationTimestamp="2025-12-01 10:27:47 +0000 UTC" firstStartedPulling="2025-12-01 10:27:47.573152501 +0000 UTC m=+4384.841801269" lastFinishedPulling="2025-12-01 10:27:57.792571324 +0000 UTC m=+4395.061220092" observedRunningTime="2025-12-01 10:27:58.711062835 +0000 UTC m=+4395.979711603" watchObservedRunningTime="2025-12-01 10:27:58.721172197 +0000 UTC m=+4395.989820965" Dec 01 10:28:41 crc kubenswrapper[4763]: I1201 10:28:41.059330 4763 generic.go:334] "Generic (PLEG): container finished" podID="5e0fa98c-0264-41f1-a5a3-50616bef1c55" containerID="d26e6af64ece59b47dc8030d3167b91b18d6949e5fe167d0233e9113c29d8cb9" exitCode=0 Dec 01 10:28:41 crc kubenswrapper[4763]: I1201 10:28:41.059413 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/crc-debug-2vkz2" event={"ID":"5e0fa98c-0264-41f1-a5a3-50616bef1c55","Type":"ContainerDied","Data":"d26e6af64ece59b47dc8030d3167b91b18d6949e5fe167d0233e9113c29d8cb9"} Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.172066 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.207773 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-prg7h/crc-debug-2vkz2"] Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.219433 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-prg7h/crc-debug-2vkz2"] Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.276044 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bgxq\" (UniqueName: \"kubernetes.io/projected/5e0fa98c-0264-41f1-a5a3-50616bef1c55-kube-api-access-6bgxq\") pod \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\" (UID: \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\") " Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.276288 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e0fa98c-0264-41f1-a5a3-50616bef1c55-host\") pod \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\" (UID: \"5e0fa98c-0264-41f1-a5a3-50616bef1c55\") " Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.276827 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e0fa98c-0264-41f1-a5a3-50616bef1c55-host" (OuterVolumeSpecName: "host") pod "5e0fa98c-0264-41f1-a5a3-50616bef1c55" (UID: "5e0fa98c-0264-41f1-a5a3-50616bef1c55"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.282553 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0fa98c-0264-41f1-a5a3-50616bef1c55-kube-api-access-6bgxq" (OuterVolumeSpecName: "kube-api-access-6bgxq") pod "5e0fa98c-0264-41f1-a5a3-50616bef1c55" (UID: "5e0fa98c-0264-41f1-a5a3-50616bef1c55"). InnerVolumeSpecName "kube-api-access-6bgxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.379024 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e0fa98c-0264-41f1-a5a3-50616bef1c55-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:42 crc kubenswrapper[4763]: I1201 10:28:42.379062 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bgxq\" (UniqueName: \"kubernetes.io/projected/5e0fa98c-0264-41f1-a5a3-50616bef1c55-kube-api-access-6bgxq\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.010642 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0fa98c-0264-41f1-a5a3-50616bef1c55" path="/var/lib/kubelet/pods/5e0fa98c-0264-41f1-a5a3-50616bef1c55/volumes" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.075478 4763 scope.go:117] "RemoveContainer" containerID="d26e6af64ece59b47dc8030d3167b91b18d6949e5fe167d0233e9113c29d8cb9" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.075654 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-2vkz2" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.513947 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-prg7h/crc-debug-kbrtb"] Dec 01 10:28:43 crc kubenswrapper[4763]: E1201 10:28:43.514327 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0fa98c-0264-41f1-a5a3-50616bef1c55" containerName="container-00" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.514341 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0fa98c-0264-41f1-a5a3-50616bef1c55" containerName="container-00" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.514544 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0fa98c-0264-41f1-a5a3-50616bef1c55" containerName="container-00" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.515172 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.603300 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b3a8-3533-4516-99a7-b55a7516b025-host\") pod \"crc-debug-kbrtb\" (UID: \"37d2b3a8-3533-4516-99a7-b55a7516b025\") " pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.603669 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz8q6\" (UniqueName: \"kubernetes.io/projected/37d2b3a8-3533-4516-99a7-b55a7516b025-kube-api-access-qz8q6\") pod \"crc-debug-kbrtb\" (UID: \"37d2b3a8-3533-4516-99a7-b55a7516b025\") " pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.706023 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz8q6\" (UniqueName: \"kubernetes.io/projected/37d2b3a8-3533-4516-99a7-b55a7516b025-kube-api-access-qz8q6\") pod \"crc-debug-kbrtb\" (UID: \"37d2b3a8-3533-4516-99a7-b55a7516b025\") " pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.706484 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b3a8-3533-4516-99a7-b55a7516b025-host\") pod \"crc-debug-kbrtb\" (UID: \"37d2b3a8-3533-4516-99a7-b55a7516b025\") " pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.706567 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b3a8-3533-4516-99a7-b55a7516b025-host\") pod \"crc-debug-kbrtb\" (UID: \"37d2b3a8-3533-4516-99a7-b55a7516b025\") " pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.723854 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz8q6\" (UniqueName: \"kubernetes.io/projected/37d2b3a8-3533-4516-99a7-b55a7516b025-kube-api-access-qz8q6\") pod \"crc-debug-kbrtb\" (UID: \"37d2b3a8-3533-4516-99a7-b55a7516b025\") " pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:43 crc kubenswrapper[4763]: I1201 10:28:43.840471 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:44 crc kubenswrapper[4763]: I1201 10:28:44.087333 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/crc-debug-kbrtb" event={"ID":"37d2b3a8-3533-4516-99a7-b55a7516b025","Type":"ContainerStarted","Data":"e61a776e020ea9f3af027bcc71383c14a2f201ff8e2203017684f0da49ccba3e"} Dec 01 10:28:45 crc kubenswrapper[4763]: I1201 10:28:45.097986 4763 generic.go:334] "Generic (PLEG): container finished" podID="37d2b3a8-3533-4516-99a7-b55a7516b025" containerID="38a5d5e77ef538144915ec98943e139291f6f18e2624b9eb52200cff5b002979" exitCode=0 Dec 01 10:28:45 crc kubenswrapper[4763]: I1201 10:28:45.098059 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/crc-debug-kbrtb" event={"ID":"37d2b3a8-3533-4516-99a7-b55a7516b025","Type":"ContainerDied","Data":"38a5d5e77ef538144915ec98943e139291f6f18e2624b9eb52200cff5b002979"} Dec 01 10:28:45 crc kubenswrapper[4763]: I1201 10:28:45.510067 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-prg7h/crc-debug-kbrtb"] Dec 01 10:28:45 crc kubenswrapper[4763]: I1201 10:28:45.520369 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-prg7h/crc-debug-kbrtb"] Dec 01 10:28:46 crc kubenswrapper[4763]: I1201 10:28:46.756936 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:46 crc kubenswrapper[4763]: I1201 10:28:46.867359 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz8q6\" (UniqueName: \"kubernetes.io/projected/37d2b3a8-3533-4516-99a7-b55a7516b025-kube-api-access-qz8q6\") pod \"37d2b3a8-3533-4516-99a7-b55a7516b025\" (UID: \"37d2b3a8-3533-4516-99a7-b55a7516b025\") " Dec 01 10:28:46 crc kubenswrapper[4763]: I1201 10:28:46.867721 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b3a8-3533-4516-99a7-b55a7516b025-host\") pod \"37d2b3a8-3533-4516-99a7-b55a7516b025\" (UID: \"37d2b3a8-3533-4516-99a7-b55a7516b025\") " Dec 01 10:28:46 crc kubenswrapper[4763]: I1201 10:28:46.868571 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37d2b3a8-3533-4516-99a7-b55a7516b025-host" (OuterVolumeSpecName: "host") pod "37d2b3a8-3533-4516-99a7-b55a7516b025" (UID: "37d2b3a8-3533-4516-99a7-b55a7516b025"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:28:46 crc kubenswrapper[4763]: I1201 10:28:46.876697 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d2b3a8-3533-4516-99a7-b55a7516b025-kube-api-access-qz8q6" (OuterVolumeSpecName: "kube-api-access-qz8q6") pod "37d2b3a8-3533-4516-99a7-b55a7516b025" (UID: "37d2b3a8-3533-4516-99a7-b55a7516b025"). InnerVolumeSpecName "kube-api-access-qz8q6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:28:46 crc kubenswrapper[4763]: I1201 10:28:46.970815 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz8q6\" (UniqueName: \"kubernetes.io/projected/37d2b3a8-3533-4516-99a7-b55a7516b025-kube-api-access-qz8q6\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:46 crc kubenswrapper[4763]: I1201 10:28:46.971090 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b3a8-3533-4516-99a7-b55a7516b025-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.004737 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d2b3a8-3533-4516-99a7-b55a7516b025" path="/var/lib/kubelet/pods/37d2b3a8-3533-4516-99a7-b55a7516b025/volumes" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.112867 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-prg7h/crc-debug-rmbc7"] Dec 01 10:28:47 crc kubenswrapper[4763]: E1201 10:28:47.113554 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d2b3a8-3533-4516-99a7-b55a7516b025" containerName="container-00" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.113583 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d2b3a8-3533-4516-99a7-b55a7516b025" containerName="container-00" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.113845 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d2b3a8-3533-4516-99a7-b55a7516b025" containerName="container-00" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.114749 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.120840 4763 scope.go:117] "RemoveContainer" containerID="38a5d5e77ef538144915ec98943e139291f6f18e2624b9eb52200cff5b002979" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.120877 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-kbrtb" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.174409 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1012af86-ae5f-4231-91fb-debff875796d-host\") pod \"crc-debug-rmbc7\" (UID: \"1012af86-ae5f-4231-91fb-debff875796d\") " pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.174846 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjh8w\" (UniqueName: \"kubernetes.io/projected/1012af86-ae5f-4231-91fb-debff875796d-kube-api-access-rjh8w\") pod \"crc-debug-rmbc7\" (UID: \"1012af86-ae5f-4231-91fb-debff875796d\") " pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.276644 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjh8w\" (UniqueName: \"kubernetes.io/projected/1012af86-ae5f-4231-91fb-debff875796d-kube-api-access-rjh8w\") pod \"crc-debug-rmbc7\" (UID: \"1012af86-ae5f-4231-91fb-debff875796d\") " pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.276816 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1012af86-ae5f-4231-91fb-debff875796d-host\") pod \"crc-debug-rmbc7\" (UID: \"1012af86-ae5f-4231-91fb-debff875796d\") " pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.276923 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1012af86-ae5f-4231-91fb-debff875796d-host\") pod \"crc-debug-rmbc7\" (UID: \"1012af86-ae5f-4231-91fb-debff875796d\") " pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.294898 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjh8w\" (UniqueName: \"kubernetes.io/projected/1012af86-ae5f-4231-91fb-debff875796d-kube-api-access-rjh8w\") pod \"crc-debug-rmbc7\" (UID: \"1012af86-ae5f-4231-91fb-debff875796d\") " pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:47 crc kubenswrapper[4763]: I1201 10:28:47.441030 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:47 crc kubenswrapper[4763]: W1201 10:28:47.488960 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1012af86_ae5f_4231_91fb_debff875796d.slice/crio-edaebaa31f52fcf21aae4681552d54f9f25668c6566fe281c6ae04153f2a03fb WatchSource:0}: Error finding container edaebaa31f52fcf21aae4681552d54f9f25668c6566fe281c6ae04153f2a03fb: Status 404 returned error can't find the container with id edaebaa31f52fcf21aae4681552d54f9f25668c6566fe281c6ae04153f2a03fb Dec 01 10:28:48 crc kubenswrapper[4763]: I1201 10:28:48.132838 4763 generic.go:334] "Generic (PLEG): container finished" podID="1012af86-ae5f-4231-91fb-debff875796d" containerID="9330f816d2e0fd67f97f4e4385e7b84a515c3b4cdc75804f22b2369c64b88941" exitCode=0 Dec 01 10:28:48 crc kubenswrapper[4763]: I1201 10:28:48.132897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/crc-debug-rmbc7" event={"ID":"1012af86-ae5f-4231-91fb-debff875796d","Type":"ContainerDied","Data":"9330f816d2e0fd67f97f4e4385e7b84a515c3b4cdc75804f22b2369c64b88941"} Dec 01 10:28:48 crc kubenswrapper[4763]: I1201 10:28:48.133376 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/crc-debug-rmbc7" event={"ID":"1012af86-ae5f-4231-91fb-debff875796d","Type":"ContainerStarted","Data":"edaebaa31f52fcf21aae4681552d54f9f25668c6566fe281c6ae04153f2a03fb"} Dec 01 10:28:48 crc kubenswrapper[4763]: I1201 10:28:48.170719 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-prg7h/crc-debug-rmbc7"] Dec 01 10:28:48 crc kubenswrapper[4763]: I1201 10:28:48.181080 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-prg7h/crc-debug-rmbc7"] Dec 01 10:28:49 crc kubenswrapper[4763]: I1201 10:28:49.247049 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:49 crc kubenswrapper[4763]: I1201 10:28:49.320141 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjh8w\" (UniqueName: \"kubernetes.io/projected/1012af86-ae5f-4231-91fb-debff875796d-kube-api-access-rjh8w\") pod \"1012af86-ae5f-4231-91fb-debff875796d\" (UID: \"1012af86-ae5f-4231-91fb-debff875796d\") " Dec 01 10:28:49 crc kubenswrapper[4763]: I1201 10:28:49.320349 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1012af86-ae5f-4231-91fb-debff875796d-host\") pod \"1012af86-ae5f-4231-91fb-debff875796d\" (UID: \"1012af86-ae5f-4231-91fb-debff875796d\") " Dec 01 10:28:49 crc kubenswrapper[4763]: I1201 10:28:49.320588 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1012af86-ae5f-4231-91fb-debff875796d-host" (OuterVolumeSpecName: "host") pod "1012af86-ae5f-4231-91fb-debff875796d" (UID: "1012af86-ae5f-4231-91fb-debff875796d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:28:49 crc kubenswrapper[4763]: I1201 10:28:49.321224 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1012af86-ae5f-4231-91fb-debff875796d-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:49 crc kubenswrapper[4763]: I1201 10:28:49.328780 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1012af86-ae5f-4231-91fb-debff875796d-kube-api-access-rjh8w" (OuterVolumeSpecName: "kube-api-access-rjh8w") pod "1012af86-ae5f-4231-91fb-debff875796d" (UID: "1012af86-ae5f-4231-91fb-debff875796d"). InnerVolumeSpecName "kube-api-access-rjh8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:28:49 crc kubenswrapper[4763]: I1201 10:28:49.422918 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjh8w\" (UniqueName: \"kubernetes.io/projected/1012af86-ae5f-4231-91fb-debff875796d-kube-api-access-rjh8w\") on node \"crc\" DevicePath \"\"" Dec 01 10:28:50 crc kubenswrapper[4763]: I1201 10:28:50.151832 4763 scope.go:117] "RemoveContainer" containerID="9330f816d2e0fd67f97f4e4385e7b84a515c3b4cdc75804f22b2369c64b88941" Dec 01 10:28:50 crc kubenswrapper[4763]: I1201 10:28:50.151836 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/crc-debug-rmbc7" Dec 01 10:28:50 crc kubenswrapper[4763]: E1201 10:28:50.207030 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0fa98c_0264_41f1_a5a3_50616bef1c55.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:28:51 crc kubenswrapper[4763]: I1201 10:28:51.005272 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1012af86-ae5f-4231-91fb-debff875796d" path="/var/lib/kubelet/pods/1012af86-ae5f-4231-91fb-debff875796d/volumes" Dec 01 10:29:00 crc kubenswrapper[4763]: E1201 10:29:00.476026 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0fa98c_0264_41f1_a5a3_50616bef1c55.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:29:10 crc kubenswrapper[4763]: E1201 10:29:10.752572 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0fa98c_0264_41f1_a5a3_50616bef1c55.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:29:21 crc kubenswrapper[4763]: E1201 10:29:21.055609 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0fa98c_0264_41f1_a5a3_50616bef1c55.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:29:31 crc kubenswrapper[4763]: E1201 10:29:31.328292 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0fa98c_0264_41f1_a5a3_50616bef1c55.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:29:33 crc kubenswrapper[4763]: I1201 10:29:33.929422 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:29:33 crc kubenswrapper[4763]: I1201 10:29:33.930645 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:29:41 crc kubenswrapper[4763]: E1201 10:29:41.590066 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0fa98c_0264_41f1_a5a3_50616bef1c55.slice\": RecentStats: unable to find data in memory cache]" Dec 01 10:29:47 crc kubenswrapper[4763]: I1201 10:29:47.961033 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d56769fd6-btjcl_6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2/barbican-api/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.098201 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d56769fd6-btjcl_6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2/barbican-api-log/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.188695 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7db8c7446b-kkdjv_2a3e2134-8fc4-4ec2-9970-92959ed1778e/barbican-keystone-listener/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.212243 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7db8c7446b-kkdjv_2a3e2134-8fc4-4ec2-9970-92959ed1778e/barbican-keystone-listener-log/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.467927 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6948ddcbd7-94xjg_9dac2a9d-f2f8-4167-8f68-f01c9364a59f/barbican-worker-log/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.520012 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6948ddcbd7-94xjg_9dac2a9d-f2f8-4167-8f68-f01c9364a59f/barbican-worker/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.667660 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj_249234f7-8f79-4a99-a35b-d43677150bf6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.775446 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fcb70e3b-87a0-49ff-8946-e182808bf846/ceilometer-notification-agent/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.839450 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fcb70e3b-87a0-49ff-8946-e182808bf846/ceilometer-central-agent/0.log" Dec 01 10:29:48 crc kubenswrapper[4763]: I1201 10:29:48.890683 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fcb70e3b-87a0-49ff-8946-e182808bf846/proxy-httpd/0.log" Dec 01 10:29:49 crc kubenswrapper[4763]: I1201 10:29:49.023642 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fcb70e3b-87a0-49ff-8946-e182808bf846/sg-core/0.log" Dec 01 10:29:49 crc kubenswrapper[4763]: I1201 10:29:49.034598 4763 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc_e060abda-70ed-4adb-8756-15046c2a2f9d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:49 crc kubenswrapper[4763]: I1201 10:29:49.302132 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_054a3443-a215-4987-993e-ea2d282c1d0d/cinder-api/0.log" Dec 01 10:29:49 crc kubenswrapper[4763]: I1201 10:29:49.324317 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp_93089553-1487-4d3b-ab46-1cd7822aa6ad/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:49 crc kubenswrapper[4763]: I1201 10:29:49.417441 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_054a3443-a215-4987-993e-ea2d282c1d0d/cinder-api-log/0.log" Dec 01 10:29:49 crc kubenswrapper[4763]: I1201 10:29:49.651273 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_60bdb953-6735-4247-8287-16dbf4187c03/cinder-backup/0.log" Dec 01 10:29:49 crc kubenswrapper[4763]: I1201 10:29:49.672839 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_60bdb953-6735-4247-8287-16dbf4187c03/probe/0.log" Dec 01 10:29:49 crc kubenswrapper[4763]: I1201 10:29:49.864200 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e7475a3a-70b9-44d0-94b2-3c3890185f85/cinder-scheduler/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.003776 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e7475a3a-70b9-44d0-94b2-3c3890185f85/probe/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.014634 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_1a72cdbd-892b-459d-86e2-0dde31be5e39/cinder-volume/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.125728 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_1a72cdbd-892b-459d-86e2-0dde31be5e39/probe/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.239691 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7_9076b862-2a04-47bc-a6f6-bb99cd48ec2b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.418444 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw_66848849-497d-4488-898a-c529d1ef2736/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.531586 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c846ff5b9-2xtfs_34405493-5281-4822-b8f1-11e68aa61470/init/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.830032 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c846ff5b9-2xtfs_34405493-5281-4822-b8f1-11e68aa61470/init/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.927599 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f34e35d3-3a43-49ee-bee1-9ccc51135eb7/glance-httpd/0.log" Dec 01 10:29:50 crc kubenswrapper[4763]: I1201 10:29:50.935650 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-5c846ff5b9-2xtfs_34405493-5281-4822-b8f1-11e68aa61470/dnsmasq-dns/0.log" Dec 01 10:29:51 crc kubenswrapper[4763]: I1201 10:29:51.140382 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f34e35d3-3a43-49ee-bee1-9ccc51135eb7/glance-log/0.log" Dec 01 10:29:51 crc kubenswrapper[4763]: I1201 10:29:51.230817 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78/glance-httpd/0.log" Dec 01 10:29:51 crc kubenswrapper[4763]: I1201 10:29:51.244077 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78/glance-log/0.log" Dec 01 10:29:51 crc kubenswrapper[4763]: I1201 10:29:51.529861 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c998bbd96-rw26q_28f61e56-22f7-45cc-ba59-9624e668a73d/horizon/0.log" Dec 01 10:29:51 crc kubenswrapper[4763]: I1201 10:29:51.566682 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c998bbd96-rw26q_28f61e56-22f7-45cc-ba59-9624e668a73d/horizon-log/0.log" Dec 01 10:29:51 crc kubenswrapper[4763]: I1201 10:29:51.593144 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c_7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:51 crc kubenswrapper[4763]: I1201 10:29:51.881724 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-c97lf_feb1b8da-b40b-439e-a27e-3f78045bbf86/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.022349 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7cddbbbc75-sb8z7_f503ba10-c7e4-4615-9137-8138e0dfb3f9/keystone-api/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.081156 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409721-pwb6r_80866302-0e51-4cd8-8fb1-5106a5764fb8/keystone-cron/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.102925 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_22ccd9e5-e651-4111-adfa-853c6d838d96/kube-state-metrics/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.274601 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl_a458d267-3663-4b9e-baa3-c3711a334c80/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.476187 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6730d033-63cf-46f2-b779-e751663b7735/manila-api/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.519191 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6730d033-63cf-46f2-b779-e751663b7735/manila-api-log/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.563142 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_3cf5f5cb-a8a0-40f4-8acc-1f41415052d2/manila-scheduler/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.648514 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_3cf5f5cb-a8a0-40f4-8acc-1f41415052d2/probe/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.821784 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_18c3fc4b-5681-40d1-8b65-e85af0a1905e/manila-share/0.log" Dec 01 10:29:52 crc kubenswrapper[4763]: I1201 10:29:52.890949 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_18c3fc4b-5681-40d1-8b65-e85af0a1905e/probe/0.log" Dec 01 10:29:53 crc kubenswrapper[4763]: I1201 10:29:53.228579 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54896b4dfc-stxgl_e30619d4-84f7-4a40-aca1-b6885d608e03/neutron-api/0.log" Dec 01 10:29:53 crc kubenswrapper[4763]: I1201 10:29:53.286950 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54896b4dfc-stxgl_e30619d4-84f7-4a40-aca1-b6885d608e03/neutron-httpd/0.log" Dec 01 10:29:53 crc kubenswrapper[4763]: I1201 10:29:53.439592 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk_a6949d90-ef2d-4555-87b8-0929fd2048b4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:53 crc kubenswrapper[4763]: I1201 10:29:53.958096 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5c652d84-294e-4f79-bbcd-37fca6657cd6/nova-api-log/0.log" Dec 01 10:29:54 crc kubenswrapper[4763]: I1201 10:29:54.229821 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a0b95396-e7dd-4b49-b465-db158816b7ea/nova-cell0-conductor-conductor/0.log" Dec 01 10:29:54 crc kubenswrapper[4763]: I1201 10:29:54.368505 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1004fbb5-ee5c-4328-bb1f-9054e4224138/nova-cell1-conductor-conductor/0.log" Dec 01 10:29:54 crc kubenswrapper[4763]: I1201 10:29:54.389965 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5c652d84-294e-4f79-bbcd-37fca6657cd6/nova-api-api/0.log" Dec 01 10:29:54 crc kubenswrapper[4763]: I1201 10:29:54.584052 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_34e5c16d-5a9f-43f0-a2ba-ca4a768891a7/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 10:29:54 crc kubenswrapper[4763]: I1201 10:29:54.709097 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk_43b000e9-21e9-47f9-8bc7-a93a8747159e/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:55 crc kubenswrapper[4763]: I1201 10:29:55.263193 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b10d5a09-4c44-4fb9-bcc0-b04612dde39c/nova-metadata-log/0.log" Dec 01 10:29:55 crc kubenswrapper[4763]: I1201 10:29:55.988025 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e/mysql-bootstrap/0.log" Dec 01 10:29:56 crc kubenswrapper[4763]: I1201 10:29:56.107756 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_002374b8-d55c-4996-9fb9-0e4fc758dc7f/nova-scheduler-scheduler/0.log" Dec 01 10:29:56 crc kubenswrapper[4763]: I1201 10:29:56.306766 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e/mysql-bootstrap/0.log" Dec 01 
10:29:56 crc kubenswrapper[4763]: I1201 10:29:56.352612 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e/galera/0.log" Dec 01 10:29:56 crc kubenswrapper[4763]: I1201 10:29:56.727340 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fc05c35a-b504-4104-a515-737272f6b4d9/mysql-bootstrap/0.log" Dec 01 10:29:57 crc kubenswrapper[4763]: I1201 10:29:57.027987 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fc05c35a-b504-4104-a515-737272f6b4d9/galera/0.log" Dec 01 10:29:57 crc kubenswrapper[4763]: I1201 10:29:57.060722 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fc05c35a-b504-4104-a515-737272f6b4d9/mysql-bootstrap/0.log" Dec 01 10:29:57 crc kubenswrapper[4763]: I1201 10:29:57.087635 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b10d5a09-4c44-4fb9-bcc0-b04612dde39c/nova-metadata-metadata/0.log" Dec 01 10:29:57 crc kubenswrapper[4763]: I1201 10:29:57.244551 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b22eceb2-ee23-4a6c-a993-bb280fe2d41f/openstackclient/0.log" Dec 01 10:29:57 crc kubenswrapper[4763]: I1201 10:29:57.385430 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-26n5d_68a1c130-7d5e-4679-9ec7-dd63b84cc8d5/ovn-controller/0.log" Dec 01 10:29:57 crc kubenswrapper[4763]: I1201 10:29:57.481947 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4p79n_065901e9-140d-472b-8ed7-6e800f992c73/openstack-network-exporter/0.log" Dec 01 10:29:57 crc kubenswrapper[4763]: I1201 10:29:57.653866 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d2z4q_ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7/ovsdb-server-init/0.log" Dec 01 10:29:57 crc kubenswrapper[4763]: I1201 10:29:57.994595 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d2z4q_ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7/ovsdb-server-init/0.log" Dec 01 10:29:58 crc kubenswrapper[4763]: I1201 10:29:58.003644 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d2z4q_ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7/ovs-vswitchd/0.log" Dec 01 10:29:58 crc kubenswrapper[4763]: I1201 10:29:58.131150 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d2z4q_ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7/ovsdb-server/0.log" Dec 01 10:29:58 crc kubenswrapper[4763]: I1201 10:29:58.326747 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7d82w_be3fab30-e99d-4b1a-ba2c-86326fbeb363/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:29:58 crc kubenswrapper[4763]: I1201 10:29:58.444777 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc/openstack-network-exporter/0.log" Dec 01 10:29:58 crc kubenswrapper[4763]: I1201 10:29:58.449307 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc/ovn-northd/0.log" Dec 01 10:29:58 crc kubenswrapper[4763]: I1201 10:29:58.739033 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fe31b02-1b96-439f-bc58-9d2d2700d35b/openstack-network-exporter/0.log" Dec 01 10:29:58 
crc kubenswrapper[4763]: I1201 10:29:58.749469 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fe31b02-1b96-439f-bc58-9d2d2700d35b/ovsdbserver-nb/0.log" Dec 01 10:29:58 crc kubenswrapper[4763]: I1201 10:29:58.998671 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2aab79d8-d046-4877-9fa0-12d87132a99f/openstack-network-exporter/0.log" Dec 01 10:29:59 crc kubenswrapper[4763]: I1201 10:29:59.027776 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2aab79d8-d046-4877-9fa0-12d87132a99f/ovsdbserver-sb/0.log" Dec 01 10:29:59 crc kubenswrapper[4763]: I1201 10:29:59.171922 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bbd79555b-kk8vr_cdaab12f-7433-420e-bdf0-99ef2e2f5707/placement-api/0.log" Dec 01 10:29:59 crc kubenswrapper[4763]: I1201 10:29:59.401541 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bbd79555b-kk8vr_cdaab12f-7433-420e-bdf0-99ef2e2f5707/placement-log/0.log" Dec 01 10:29:59 crc kubenswrapper[4763]: I1201 10:29:59.431106 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d10e5ae-f63a-4bdf-b3f5-2f99e6856799/setup-container/0.log" Dec 01 10:29:59 crc kubenswrapper[4763]: I1201 10:29:59.683596 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d10e5ae-f63a-4bdf-b3f5-2f99e6856799/rabbitmq/0.log" Dec 01 10:29:59 crc kubenswrapper[4763]: I1201 10:29:59.709320 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d10e5ae-f63a-4bdf-b3f5-2f99e6856799/setup-container/0.log" Dec 01 10:29:59 crc kubenswrapper[4763]: I1201 10:29:59.797101 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6051a720-a09e-4c11-a9c4-465be3730f65/setup-container/0.log" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.186080 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf"] Dec 01 10:30:00 crc kubenswrapper[4763]: E1201 10:30:00.186525 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1012af86-ae5f-4231-91fb-debff875796d" containerName="container-00" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.186538 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1012af86-ae5f-4231-91fb-debff875796d" containerName="container-00" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.186727 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="1012af86-ae5f-4231-91fb-debff875796d" containerName="container-00" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.187312 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.189911 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.190121 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.197299 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf"] Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.304091 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6051a720-a09e-4c11-a9c4-465be3730f65/setup-container/0.log" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.313921 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6051a720-a09e-4c11-a9c4-465be3730f65/rabbitmq/0.log" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.371253 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1db81a5-b618-460e-bff5-772c9bc8096f-secret-volume\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.371324 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j6ct\" (UniqueName: \"kubernetes.io/projected/d1db81a5-b618-460e-bff5-772c9bc8096f-kube-api-access-6j6ct\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.371344 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1db81a5-b618-460e-bff5-772c9bc8096f-config-volume\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.412667 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx_2308a6d4-3af7-4772-8413-3803ac516e1c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.472589 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1db81a5-b618-460e-bff5-772c9bc8096f-secret-volume\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.472655 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j6ct\" (UniqueName: \"kubernetes.io/projected/d1db81a5-b618-460e-bff5-772c9bc8096f-kube-api-access-6j6ct\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.472673 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1db81a5-b618-460e-bff5-772c9bc8096f-config-volume\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.473407 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1db81a5-b618-460e-bff5-772c9bc8096f-config-volume\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.490892 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j6ct\" (UniqueName: \"kubernetes.io/projected/d1db81a5-b618-460e-bff5-772c9bc8096f-kube-api-access-6j6ct\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.491519 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1db81a5-b618-460e-bff5-772c9bc8096f-secret-volume\") pod \"collect-profiles-29409750-nt6kf\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.536158 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:00 crc kubenswrapper[4763]: I1201 10:30:00.748134 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm_09695b88-6f6f-469a-b41a-02cd50e1f216/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.081339 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf"] Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.103786 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zbl9m_b41d9a03-82fb-4f14-b13c-0437ae28a1a7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.462203 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-q84bf_57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb/ssh-known-hosts-edpm-deployment/0.log" Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.479127 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_74d118c3-e544-4a7a-ad22-de496e16f9ee/tempest-tests-tempest-tests-runner/0.log" Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.634313 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c0ee43b4-efab-48de-ba62-dedd80d83711/test-operator-logs-container/0.log" Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.825708 4763 generic.go:334] "Generic (PLEG): container finished" podID="d1db81a5-b618-460e-bff5-772c9bc8096f" containerID="5d4ef04d2bf66615d6dc23e28c270779e003133f0e467c29393d092444170c86" exitCode=0 Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.825762 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" event={"ID":"d1db81a5-b618-460e-bff5-772c9bc8096f","Type":"ContainerDied","Data":"5d4ef04d2bf66615d6dc23e28c270779e003133f0e467c29393d092444170c86"} Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.825797 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" event={"ID":"d1db81a5-b618-460e-bff5-772c9bc8096f","Type":"ContainerStarted","Data":"e57b29bbc81e679345758c35f399efda2cfd80c9435b3b775d70ff9a8a01b7db"} Dec 01 10:30:01 crc kubenswrapper[4763]: I1201 10:30:01.900127 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv_0fee9e86-e1af-4201-9817-bf22f5910477/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.384130 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.558971 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1db81a5-b618-460e-bff5-772c9bc8096f-config-volume\") pod \"d1db81a5-b618-460e-bff5-772c9bc8096f\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.559413 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1db81a5-b618-460e-bff5-772c9bc8096f-secret-volume\") pod \"d1db81a5-b618-460e-bff5-772c9bc8096f\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.559706 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1db81a5-b618-460e-bff5-772c9bc8096f-config-volume" (OuterVolumeSpecName: "config-volume") pod "d1db81a5-b618-460e-bff5-772c9bc8096f" (UID: "d1db81a5-b618-460e-bff5-772c9bc8096f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.561566 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j6ct\" (UniqueName: \"kubernetes.io/projected/d1db81a5-b618-460e-bff5-772c9bc8096f-kube-api-access-6j6ct\") pod \"d1db81a5-b618-460e-bff5-772c9bc8096f\" (UID: \"d1db81a5-b618-460e-bff5-772c9bc8096f\") " Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.567246 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1db81a5-b618-460e-bff5-772c9bc8096f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.566428 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1db81a5-b618-460e-bff5-772c9bc8096f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d1db81a5-b618-460e-bff5-772c9bc8096f" (UID: "d1db81a5-b618-460e-bff5-772c9bc8096f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.570732 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1db81a5-b618-460e-bff5-772c9bc8096f-kube-api-access-6j6ct" (OuterVolumeSpecName: "kube-api-access-6j6ct") pod "d1db81a5-b618-460e-bff5-772c9bc8096f" (UID: "d1db81a5-b618-460e-bff5-772c9bc8096f"). InnerVolumeSpecName "kube-api-access-6j6ct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.673039 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1db81a5-b618-460e-bff5-772c9bc8096f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.673080 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j6ct\" (UniqueName: \"kubernetes.io/projected/d1db81a5-b618-460e-bff5-772c9bc8096f-kube-api-access-6j6ct\") on node \"crc\" DevicePath \"\"" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.860897 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" event={"ID":"d1db81a5-b618-460e-bff5-772c9bc8096f","Type":"ContainerDied","Data":"e57b29bbc81e679345758c35f399efda2cfd80c9435b3b775d70ff9a8a01b7db"} Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.861199 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e57b29bbc81e679345758c35f399efda2cfd80c9435b3b775d70ff9a8a01b7db" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.860959 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409750-nt6kf" Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.928950 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:30:03 crc kubenswrapper[4763]: I1201 10:30:03.929007 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:30:04 crc kubenswrapper[4763]: I1201 10:30:04.527792 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt"] Dec 01 10:30:04 crc kubenswrapper[4763]: I1201 10:30:04.541832 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409705-z29tt"] Dec 01 10:30:05 crc kubenswrapper[4763]: I1201 10:30:05.034700 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22700f51-62ff-4238-9696-72f1446882f0" path="/var/lib/kubelet/pods/22700f51-62ff-4238-9696-72f1446882f0/volumes" Dec 01 10:30:09 crc kubenswrapper[4763]: I1201 10:30:09.854757 4763 scope.go:117] "RemoveContainer" containerID="a7a96ec71d3abef637dab6440cbf0a9db6f83af1bfec3ed5c8a9ec00b6a4d184" Dec 01 10:30:13 crc kubenswrapper[4763]: I1201 10:30:13.533579 4763 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-hc4lf" podUID="a09b38c8-91a6-45ed-b97b-d0370e99ab11" containerName="registry-server" probeResult="failure" output=< Dec 01 10:30:13 crc kubenswrapper[4763]: timeout: health rpc did not complete within 1s Dec 01 10:30:13 crc kubenswrapper[4763]: > Dec 01 10:30:15 crc kubenswrapper[4763]: I1201 10:30:15.679082 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_f17760ee-44e7-4bf5-b9d6-368f9b780426/memcached/0.log" Dec 01 10:30:33 crc kubenswrapper[4763]: I1201 10:30:33.929158 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:30:33 crc kubenswrapper[4763]: I1201 10:30:33.929808 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:30:33 crc kubenswrapper[4763]: I1201 10:30:33.929859 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 10:30:33 crc kubenswrapper[4763]: I1201 10:30:33.930677 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:30:33 crc kubenswrapper[4763]: I1201 10:30:33.930749 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" gracePeriod=600 Dec 01 10:30:34 crc kubenswrapper[4763]: E1201 10:30:34.058677 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:30:34 crc kubenswrapper[4763]: I1201 10:30:34.147713 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" exitCode=0 Dec 01 10:30:34 crc kubenswrapper[4763]: I1201 10:30:34.147768 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2"} Dec 01 10:30:34 crc kubenswrapper[4763]: I1201 10:30:34.147803 4763 scope.go:117] "RemoveContainer" containerID="721abf0c9085ab0076360747c073c1cfd89cac6d0e20042d7bbd0698eb2ef82e" Dec 01 10:30:34 crc kubenswrapper[4763]: I1201 10:30:34.148542 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:30:34 crc kubenswrapper[4763]: E1201 10:30:34.148828 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.061489 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/util/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.228416 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/pull/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.279399 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/util/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.301194 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/pull/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.481723 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/util/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.561440 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/extract/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.625147 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/pull/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.687553 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qh7dr_1208b653-3551-4266-99b9-e83fb86b4771/kube-rbac-proxy/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.850347 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qh7dr_1208b653-3551-4266-99b9-e83fb86b4771/manager/0.log" Dec 01 10:30:35 crc kubenswrapper[4763]: I1201 10:30:35.899677 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-nh5mz_1c980f4b-c55c-4a2e-9461-9f89ec0165c3/kube-rbac-proxy/0.log" Dec 01 10:30:36 crc kubenswrapper[4763]: I1201 10:30:36.018622 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-nh5mz_1c980f4b-c55c-4a2e-9461-9f89ec0165c3/manager/0.log" Dec 01 10:30:36 crc kubenswrapper[4763]: I1201 10:30:36.144159 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-l547d_ee716572-1b36-4216-84a6-ad3f4ac2b7f6/manager/0.log" Dec 01 10:30:36 crc kubenswrapper[4763]: I1201 10:30:36.201363 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-l547d_ee716572-1b36-4216-84a6-ad3f4ac2b7f6/kube-rbac-proxy/0.log" Dec 01 10:30:36 crc kubenswrapper[4763]: I1201 10:30:36.357745 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-xx8m2_5325eff2-4650-499e-9cad-f486bae74fce/kube-rbac-proxy/0.log" Dec 01 10:30:36 crc kubenswrapper[4763]: I1201 10:30:36.524235 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-xx8m2_5325eff2-4650-499e-9cad-f486bae74fce/manager/0.log" Dec 01 10:30:36 crc kubenswrapper[4763]: I1201 10:30:36.673551 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rjzdx_eae2e950-9f81-49cc-926e-380b81a0f0e7/kube-rbac-proxy/0.log" Dec 01 10:30:36 crc kubenswrapper[4763]: I1201 10:30:36.905283 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rjzdx_eae2e950-9f81-49cc-926e-380b81a0f0e7/manager/0.log" Dec 01 10:30:36 crc kubenswrapper[4763]: I1201 10:30:36.919374 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dbhs6_1e77763b-639f-46aa-a798-e39251aa8636/kube-rbac-proxy/0.log" Dec 01 10:30:37 crc kubenswrapper[4763]: I1201 10:30:37.185443 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-9vjrk_2aecf763-7a48-4c6f-a66c-ea391befd47a/kube-rbac-proxy/0.log" Dec 01 10:30:37 crc kubenswrapper[4763]: I1201 10:30:37.270503 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dbhs6_1e77763b-639f-46aa-a798-e39251aa8636/manager/0.log" Dec 01 10:30:37 crc kubenswrapper[4763]: I1201 10:30:37.388782 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-9vjrk_2aecf763-7a48-4c6f-a66c-ea391befd47a/manager/0.log" Dec 01 10:30:37 crc kubenswrapper[4763]: I1201 10:30:37.544187 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4srbr_c0ed7161-2907-48a6-894d-c6e3a1f47e0e/kube-rbac-proxy/0.log" Dec 01 10:30:37 crc kubenswrapper[4763]: I1201 10:30:37.597517 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4srbr_c0ed7161-2907-48a6-894d-c6e3a1f47e0e/manager/0.log" Dec 01 10:30:37 crc kubenswrapper[4763]: I1201 10:30:37.787327 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-t74fr_f5580ab2-73e2-4766-8e9c-f217fd4c079d/kube-rbac-proxy/0.log" Dec 01 10:30:38 crc kubenswrapper[4763]: I1201 10:30:38.195971 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-t74fr_f5580ab2-73e2-4766-8e9c-f217fd4c079d/manager/0.log" Dec 01 10:30:38 crc kubenswrapper[4763]: I1201 10:30:38.223490 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-9x2q4_4e44f450-61c5-4f49-b16b-8c9e0f060879/kube-rbac-proxy/0.log" Dec 01 10:30:38 crc kubenswrapper[4763]: I1201 10:30:38.365937 4763 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-9x2q4_4e44f450-61c5-4f49-b16b-8c9e0f060879/manager/0.log" Dec 01 10:30:38 crc kubenswrapper[4763]: I1201 10:30:38.451383 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-nb5kc_9eef10a1-bfcc-412c-9687-fee23d90d448/kube-rbac-proxy/0.log" Dec 01 10:30:38 crc kubenswrapper[4763]: I1201 10:30:38.498279 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-nb5kc_9eef10a1-bfcc-412c-9687-fee23d90d448/manager/0.log" Dec 01 10:30:38 crc kubenswrapper[4763]: I1201 10:30:38.781922 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-9f5r9_a0713966-4e10-4b8b-84bc-6560d1b1bf5a/kube-rbac-proxy/0.log" Dec 01 10:30:38 crc kubenswrapper[4763]: I1201 10:30:38.783201 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-9f5r9_a0713966-4e10-4b8b-84bc-6560d1b1bf5a/manager/0.log" Dec 01 10:30:38 crc kubenswrapper[4763]: I1201 10:30:38.923838 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-jxvhd_0ee4b811-c59a-4120-bf78-53fe9e049d4b/kube-rbac-proxy/0.log" Dec 01 10:30:39 crc kubenswrapper[4763]: I1201 10:30:39.002160 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-jbvjc_ddef7d32-1d4c-496d-be36-7ae7af64205a/kube-rbac-proxy/0.log" Dec 01 10:30:39 crc kubenswrapper[4763]: I1201 10:30:39.142098 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-jxvhd_0ee4b811-c59a-4120-bf78-53fe9e049d4b/manager/0.log" Dec 01 10:30:39 crc kubenswrapper[4763]: I1201 10:30:39.196587 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-jbvjc_ddef7d32-1d4c-496d-be36-7ae7af64205a/manager/0.log" Dec 01 10:30:39 crc kubenswrapper[4763]: I1201 10:30:39.342672 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj_e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0/kube-rbac-proxy/0.log" Dec 01 10:30:39 crc kubenswrapper[4763]: I1201 10:30:39.343476 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj_e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0/manager/0.log" Dec 01 10:30:39 crc kubenswrapper[4763]: I1201 10:30:39.870226 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5794fdf75-8f5zw_de261a18-aec0-4ea5-aaf9-e313631599e6/operator/0.log" Dec 01 10:30:40 crc kubenswrapper[4763]: I1201 10:30:40.005969 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4qcm5_f78f08c4-82f9-45c0-92d4-325c6e066d44/registry-server/0.log" Dec 01 10:30:40 crc kubenswrapper[4763]: I1201 10:30:40.313150 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-k2nzk_c274cd4c-0c77-485c-8d8f-116a2f7b013b/kube-rbac-proxy/0.log" Dec 01 10:30:40 crc kubenswrapper[4763]: I1201 
10:30:40.430664 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-k2nzk_c274cd4c-0c77-485c-8d8f-116a2f7b013b/manager/0.log" Dec 01 10:30:40 crc kubenswrapper[4763]: I1201 10:30:40.447895 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-tk5xf_13a2ac2b-0374-4da0-abbf-6aecbc3afbb8/kube-rbac-proxy/0.log" Dec 01 10:30:40 crc kubenswrapper[4763]: I1201 10:30:40.908439 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d555457c4-jcpzh_dd971a72-ce63-45cb-9457-43fcea25f677/manager/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.090672 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-tk5xf_13a2ac2b-0374-4da0-abbf-6aecbc3afbb8/manager/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.134495 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-s4rlp_6b7d748f-a9e4-416a-8fd7-9fa46ca2060d/operator/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.186938 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-mqvfc_20d8f0e3-8406-4e55-adbf-0681e090a82e/kube-rbac-proxy/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.245578 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-mqvfc_20d8f0e3-8406-4e55-adbf-0681e090a82e/manager/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.405551 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2skdl_89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0/manager/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.421753 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2skdl_89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0/kube-rbac-proxy/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.591759 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-64k9k_664dabb5-40f4-44c4-be9d-1870e153c877/kube-rbac-proxy/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.950981 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-64k9k_664dabb5-40f4-44c4-be9d-1870e153c877/manager/0.log" Dec 01 10:30:41 crc kubenswrapper[4763]: I1201 10:30:41.995960 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-ctr5d_3f494774-a168-4199-bfff-e73f64a669cf/manager/0.log" Dec 01 10:30:42 crc kubenswrapper[4763]: I1201 10:30:42.024577 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-ctr5d_3f494774-a168-4199-bfff-e73f64a669cf/kube-rbac-proxy/0.log" Dec 01 10:30:47 crc kubenswrapper[4763]: I1201 10:30:47.995057 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:30:47 crc kubenswrapper[4763]: E1201 10:30:47.995758 4763 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:31:03 crc kubenswrapper[4763]: I1201 10:31:03.000489 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:31:03 crc kubenswrapper[4763]: E1201 10:31:03.001242 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:31:03 crc kubenswrapper[4763]: I1201 10:31:03.812428 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-l2f2g_f6329e56-18d1-4479-8699-897fdfdc60fb/control-plane-machine-set-operator/0.log" Dec 01 10:31:04 crc kubenswrapper[4763]: I1201 10:31:04.084227 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l4kcj_2307c38a-2af7-4b03-b99a-e5ca5bed76a8/kube-rbac-proxy/0.log" Dec 01 10:31:04 crc kubenswrapper[4763]: I1201 10:31:04.134101 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l4kcj_2307c38a-2af7-4b03-b99a-e5ca5bed76a8/machine-api-operator/0.log" Dec 01 10:31:18 crc kubenswrapper[4763]: I1201 10:31:18.000356 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:31:18 crc kubenswrapper[4763]: E1201 10:31:18.001216 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:31:19 crc kubenswrapper[4763]: I1201 10:31:19.396919 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-s2hg7_f4a5555c-4f44-4a2e-a9bf-6daab8490e32/cert-manager-controller/0.log" Dec 01 10:31:19 crc kubenswrapper[4763]: I1201 10:31:19.460953 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-tq2xd_99f9aa79-5bae-4215-b137-baef58e56e96/cert-manager-cainjector/0.log" Dec 01 10:31:19 crc kubenswrapper[4763]: I1201 10:31:19.639082 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-mjbm5_8c962037-246b-4727-8aab-6632e2e9e5f7/cert-manager-webhook/0.log" Dec 01 10:31:33 crc kubenswrapper[4763]: I1201 10:31:33.000957 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:31:33 crc kubenswrapper[4763]: E1201 10:31:33.001760 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:31:34 crc kubenswrapper[4763]: I1201 10:31:34.479030 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-rsnnh_757fd525-a1b2-45c0-a3eb-7b8c3d6423d3/nmstate-console-plugin/0.log" Dec 01 10:31:34 crc kubenswrapper[4763]: I1201 10:31:34.709489 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-h8q8q_1e79baf9-ce6c-4a92-891f-54eba3049168/nmstate-handler/0.log" Dec 01 10:31:34 crc kubenswrapper[4763]: I1201 10:31:34.747561 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-v2v9v_6789e53e-89e7-4593-a298-4b9eb0e0cf24/nmstate-metrics/0.log" Dec 01 10:31:34 crc kubenswrapper[4763]: I1201 10:31:34.762325 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-v2v9v_6789e53e-89e7-4593-a298-4b9eb0e0cf24/kube-rbac-proxy/0.log" Dec 01 10:31:34 crc kubenswrapper[4763]: I1201 10:31:34.984022 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-5vb8f_8b28a543-e1fd-4862-af9e-b4c77a652700/nmstate-operator/0.log" Dec 01 10:31:35 crc kubenswrapper[4763]: I1201 10:31:35.087936 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-z2gjv_31ef1fc8-adab-4f75-bac1-e8ff859eb425/nmstate-webhook/0.log" Dec 01 10:31:44 crc kubenswrapper[4763]: I1201 10:31:44.994062 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:31:44 crc kubenswrapper[4763]: E1201 10:31:44.994804 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:31:52 crc kubenswrapper[4763]: I1201 10:31:52.495958 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-68xth_39d8a539-ae28-40cb-b850-d40b3cc839b8/controller/0.log" Dec 01 10:31:52 crc kubenswrapper[4763]: I1201 10:31:52.505486 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-68xth_39d8a539-ae28-40cb-b850-d40b3cc839b8/kube-rbac-proxy/0.log" Dec 01 10:31:52 crc kubenswrapper[4763]: I1201 10:31:52.751688 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-frr-files/0.log" Dec 01 10:31:52 crc kubenswrapper[4763]: I1201 10:31:52.924534 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-reloader/0.log" Dec 01 10:31:52 crc kubenswrapper[4763]: I1201 10:31:52.968909 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-frr-files/0.log" 
Dec 01 10:31:52 crc kubenswrapper[4763]: I1201 10:31:52.987149 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-metrics/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.027208 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-reloader/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.232197 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-frr-files/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.267031 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-reloader/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.277471 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-metrics/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.315762 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-metrics/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.532936 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-frr-files/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.534421 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/controller/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.540053 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-metrics/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.542473 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-reloader/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.752684 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/frr-metrics/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.785050 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/kube-rbac-proxy-frr/0.log" Dec 01 10:31:53 crc kubenswrapper[4763]: I1201 10:31:53.824185 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/kube-rbac-proxy/0.log" Dec 01 10:31:54 crc kubenswrapper[4763]: I1201 10:31:54.065992 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-6kd8p_e1113818-415e-494e-8979-9de8da7db507/frr-k8s-webhook-server/0.log" Dec 01 10:31:54 crc kubenswrapper[4763]: I1201 10:31:54.068689 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/reloader/0.log" Dec 01 10:31:54 crc kubenswrapper[4763]: I1201 10:31:54.387025 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7488746df5-gj8c5_e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5/manager/0.log" Dec 01 10:31:54 crc kubenswrapper[4763]: I1201 10:31:54.718936 4763 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ljjmd_bd336d9e-af02-4fc6-ae34-147c379ba374/kube-rbac-proxy/0.log" Dec 01 10:31:54 crc kubenswrapper[4763]: I1201 10:31:54.784814 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c7c865bc4-5b725_eeaeb040-1a35-4174-9c9c-7ffe226a79e5/webhook-server/0.log" Dec 01 10:31:55 crc kubenswrapper[4763]: I1201 10:31:55.288371 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/frr/0.log" Dec 01 10:31:55 crc kubenswrapper[4763]: I1201 10:31:55.482796 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ljjmd_bd336d9e-af02-4fc6-ae34-147c379ba374/speaker/0.log" Dec 01 10:31:55 crc kubenswrapper[4763]: I1201 10:31:55.994311 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:31:55 crc kubenswrapper[4763]: E1201 10:31:55.994613 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:32:07 crc kubenswrapper[4763]: I1201 10:32:07.995041 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:32:07 crc kubenswrapper[4763]: E1201 10:32:07.995888 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:32:09 crc kubenswrapper[4763]: I1201 10:32:09.904528 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/util/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.096249 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/pull/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.108585 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/util/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.121836 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/pull/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.335993 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/extract/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.340608 4763 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/util/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.346727 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/pull/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.487388 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/util/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.738617 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/pull/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.753923 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/pull/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.822077 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/util/0.log" Dec 01 10:32:10 crc kubenswrapper[4763]: I1201 10:32:10.997162 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/util/0.log" Dec 01 10:32:11 crc kubenswrapper[4763]: I1201 10:32:11.043362 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/extract/0.log" Dec 01 10:32:11 crc kubenswrapper[4763]: I1201 10:32:11.173785 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/pull/0.log" Dec 01 10:32:11 crc kubenswrapper[4763]: I1201 10:32:11.303882 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-utilities/0.log" Dec 01 10:32:11 crc kubenswrapper[4763]: I1201 10:32:11.502067 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-content/0.log" Dec 01 10:32:11 crc kubenswrapper[4763]: I1201 10:32:11.523707 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-utilities/0.log" Dec 01 10:32:11 crc kubenswrapper[4763]: I1201 10:32:11.569441 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-content/0.log" Dec 01 10:32:11 crc kubenswrapper[4763]: I1201 10:32:11.719487 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-utilities/0.log" Dec 01 10:32:11 crc kubenswrapper[4763]: I1201 10:32:11.796335 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-content/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.035820 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-utilities/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.367252 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-utilities/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.416547 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-content/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.422835 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/registry-server/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.467064 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-content/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.622033 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-content/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.674889 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-utilities/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.969117 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/registry-server/0.log" Dec 01 10:32:12 crc kubenswrapper[4763]: I1201 10:32:12.984054 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7mwbs_8ed98359-8184-409c-9f5d-f2b2b21b9cb7/marketplace-operator/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.100055 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-utilities/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.321870 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-utilities/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.339437 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-content/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.341637 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-content/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.522355 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-utilities/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.552659 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-content/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.707490 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/registry-server/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.778823 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-utilities/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.926257 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-content/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.957447 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-utilities/0.log" Dec 01 10:32:13 crc kubenswrapper[4763]: I1201 10:32:13.980626 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-content/0.log" Dec 01 10:32:14 crc kubenswrapper[4763]: I1201 10:32:14.105677 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-utilities/0.log" Dec 01 10:32:14 crc kubenswrapper[4763]: I1201 10:32:14.124670 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-content/0.log" Dec 01 10:32:14 crc kubenswrapper[4763]: I1201 10:32:14.267640 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/registry-server/0.log" Dec 01 10:32:18 crc kubenswrapper[4763]: I1201 10:32:18.995526 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:32:18 crc kubenswrapper[4763]: E1201 10:32:18.996207 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:32:33 crc kubenswrapper[4763]: I1201 10:32:33.001165 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:32:33 crc kubenswrapper[4763]: E1201 10:32:33.002085 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:32:45 crc kubenswrapper[4763]: I1201 10:32:45.994777 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:32:45 crc kubenswrapper[4763]: 
E1201 10:32:45.995508 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:32:58 crc kubenswrapper[4763]: I1201 10:32:58.994131 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:32:58 crc kubenswrapper[4763]: E1201 10:32:58.995048 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.759172 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wrpmx"] Dec 01 10:33:07 crc kubenswrapper[4763]: E1201 10:33:07.761541 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1db81a5-b618-460e-bff5-772c9bc8096f" containerName="collect-profiles" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.761644 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1db81a5-b618-460e-bff5-772c9bc8096f" containerName="collect-profiles" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.761949 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1db81a5-b618-460e-bff5-772c9bc8096f" containerName="collect-profiles" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.763830 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.797833 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrpmx"] Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.896438 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-utilities\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.896570 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-catalog-content\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.896668 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7bxn\" (UniqueName: \"kubernetes.io/projected/d676bd7a-0e7a-4277-9c05-800f6a20f809-kube-api-access-n7bxn\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.998661 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-utilities\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.998983 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-catalog-content\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.999129 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7bxn\" (UniqueName: \"kubernetes.io/projected/d676bd7a-0e7a-4277-9c05-800f6a20f809-kube-api-access-n7bxn\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.999215 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-utilities\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:07 crc kubenswrapper[4763]: I1201 10:33:07.999358 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-catalog-content\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:08 crc kubenswrapper[4763]: I1201 10:33:08.029486 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n7bxn\" (UniqueName: \"kubernetes.io/projected/d676bd7a-0e7a-4277-9c05-800f6a20f809-kube-api-access-n7bxn\") pod \"redhat-marketplace-wrpmx\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:08 crc kubenswrapper[4763]: I1201 10:33:08.086723 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:08 crc kubenswrapper[4763]: I1201 10:33:08.712263 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrpmx"] Dec 01 10:33:09 crc kubenswrapper[4763]: W1201 10:33:09.115965 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd676bd7a_0e7a_4277_9c05_800f6a20f809.slice/crio-f10512d8ef3578fe94b797c71f1657555b89d2b01aaf4901e08bf1ed7f6ef95c WatchSource:0}: Error finding container f10512d8ef3578fe94b797c71f1657555b89d2b01aaf4901e08bf1ed7f6ef95c: Status 404 returned error can't find the container with id f10512d8ef3578fe94b797c71f1657555b89d2b01aaf4901e08bf1ed7f6ef95c Dec 01 10:33:10 crc kubenswrapper[4763]: I1201 10:33:10.048828 4763 generic.go:334] "Generic (PLEG): container finished" podID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerID="2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7" exitCode=0 Dec 01 10:33:10 crc kubenswrapper[4763]: I1201 10:33:10.048934 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrpmx" event={"ID":"d676bd7a-0e7a-4277-9c05-800f6a20f809","Type":"ContainerDied","Data":"2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7"} Dec 01 10:33:10 crc kubenswrapper[4763]: I1201 10:33:10.049234 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrpmx" event={"ID":"d676bd7a-0e7a-4277-9c05-800f6a20f809","Type":"ContainerStarted","Data":"f10512d8ef3578fe94b797c71f1657555b89d2b01aaf4901e08bf1ed7f6ef95c"} Dec 01 10:33:10 crc kubenswrapper[4763]: I1201 10:33:10.050692 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:33:13 crc kubenswrapper[4763]: I1201 10:33:13.003268 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:33:13 crc kubenswrapper[4763]: E1201 10:33:13.005022 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:33:13 crc kubenswrapper[4763]: I1201 10:33:13.081008 4763 generic.go:334] "Generic (PLEG): container finished" podID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerID="f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5" exitCode=0 Dec 01 10:33:13 crc kubenswrapper[4763]: I1201 10:33:13.081265 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrpmx" event={"ID":"d676bd7a-0e7a-4277-9c05-800f6a20f809","Type":"ContainerDied","Data":"f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5"} Dec 01 10:33:14 crc kubenswrapper[4763]: I1201 
10:33:14.093666 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrpmx" event={"ID":"d676bd7a-0e7a-4277-9c05-800f6a20f809","Type":"ContainerStarted","Data":"8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12"} Dec 01 10:33:14 crc kubenswrapper[4763]: I1201 10:33:14.120341 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wrpmx" podStartSLOduration=3.601646748 podStartE2EDuration="7.120323121s" podCreationTimestamp="2025-12-01 10:33:07 +0000 UTC" firstStartedPulling="2025-12-01 10:33:10.05036915 +0000 UTC m=+4707.319017918" lastFinishedPulling="2025-12-01 10:33:13.569045483 +0000 UTC m=+4710.837694291" observedRunningTime="2025-12-01 10:33:14.114486092 +0000 UTC m=+4711.383134860" watchObservedRunningTime="2025-12-01 10:33:14.120323121 +0000 UTC m=+4711.388971889" Dec 01 10:33:18 crc kubenswrapper[4763]: I1201 10:33:18.087027 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:18 crc kubenswrapper[4763]: I1201 10:33:18.087652 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:18 crc kubenswrapper[4763]: I1201 10:33:18.210552 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:18 crc kubenswrapper[4763]: I1201 10:33:18.287142 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:18 crc kubenswrapper[4763]: I1201 10:33:18.451818 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrpmx"] Dec 01 10:33:20 crc kubenswrapper[4763]: I1201 10:33:20.151345 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wrpmx" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerName="registry-server" containerID="cri-o://8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12" gracePeriod=2 Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.161051 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.161428 4763 generic.go:334] "Generic (PLEG): container finished" podID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerID="8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12" exitCode=0 Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.161453 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrpmx" event={"ID":"d676bd7a-0e7a-4277-9c05-800f6a20f809","Type":"ContainerDied","Data":"8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12"} Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.161709 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wrpmx" event={"ID":"d676bd7a-0e7a-4277-9c05-800f6a20f809","Type":"ContainerDied","Data":"f10512d8ef3578fe94b797c71f1657555b89d2b01aaf4901e08bf1ed7f6ef95c"} Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.161730 4763 scope.go:117] "RemoveContainer" containerID="8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.187159 4763 scope.go:117] "RemoveContainer" containerID="f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.217834 4763 scope.go:117] "RemoveContainer" containerID="2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.265833 4763 scope.go:117] "RemoveContainer" containerID="8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12" Dec 01 10:33:21 crc kubenswrapper[4763]: E1201 10:33:21.266192 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12\": container with ID starting with 8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12 not found: ID does not exist" containerID="8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.266234 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12"} err="failed to get container status \"8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12\": rpc error: code = NotFound desc = could not find container \"8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12\": container with ID starting with 8f1a0b039b6068c256cd0870820b9f9b0dcdf354a86717abdef576a556b73b12 not found: ID does not exist" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.266266 4763 scope.go:117] "RemoveContainer" containerID="f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5" Dec 01 10:33:21 crc kubenswrapper[4763]: E1201 10:33:21.266691 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5\": container with ID starting with f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5 not found: ID does not exist" containerID="f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.266721 4763 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5"} err="failed to get container status \"f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5\": rpc error: code = NotFound desc = could not find container \"f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5\": container with ID starting with f713417729ecc37a82b6e0a37a52d9bbd66fc55e60c597ff65c0e271f163c3d5 not found: ID does not exist" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.266741 4763 scope.go:117] "RemoveContainer" containerID="2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7" Dec 01 10:33:21 crc kubenswrapper[4763]: E1201 10:33:21.266951 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7\": container with ID starting with 2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7 not found: ID does not exist" containerID="2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.266975 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7"} err="failed to get container status \"2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7\": rpc error: code = NotFound desc = could not find container \"2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7\": container with ID starting with 2c7e94cf6977f8591cb7bceed4fb3b823af6531ed4abd8f477ea6f1151555bb7 not found: ID does not exist" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.276022 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-catalog-content\") pod \"d676bd7a-0e7a-4277-9c05-800f6a20f809\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.276196 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-utilities\") pod \"d676bd7a-0e7a-4277-9c05-800f6a20f809\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.276277 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7bxn\" (UniqueName: \"kubernetes.io/projected/d676bd7a-0e7a-4277-9c05-800f6a20f809-kube-api-access-n7bxn\") pod \"d676bd7a-0e7a-4277-9c05-800f6a20f809\" (UID: \"d676bd7a-0e7a-4277-9c05-800f6a20f809\") " Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.277071 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-utilities" (OuterVolumeSpecName: "utilities") pod "d676bd7a-0e7a-4277-9c05-800f6a20f809" (UID: "d676bd7a-0e7a-4277-9c05-800f6a20f809"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.283694 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d676bd7a-0e7a-4277-9c05-800f6a20f809-kube-api-access-n7bxn" (OuterVolumeSpecName: "kube-api-access-n7bxn") pod "d676bd7a-0e7a-4277-9c05-800f6a20f809" (UID: "d676bd7a-0e7a-4277-9c05-800f6a20f809"). InnerVolumeSpecName "kube-api-access-n7bxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.299977 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d676bd7a-0e7a-4277-9c05-800f6a20f809" (UID: "d676bd7a-0e7a-4277-9c05-800f6a20f809"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.379178 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.379221 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7bxn\" (UniqueName: \"kubernetes.io/projected/d676bd7a-0e7a-4277-9c05-800f6a20f809-kube-api-access-n7bxn\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:21 crc kubenswrapper[4763]: I1201 10:33:21.379239 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d676bd7a-0e7a-4277-9c05-800f6a20f809-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:33:22 crc kubenswrapper[4763]: I1201 10:33:22.179576 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wrpmx" Dec 01 10:33:22 crc kubenswrapper[4763]: I1201 10:33:22.237730 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrpmx"] Dec 01 10:33:22 crc kubenswrapper[4763]: I1201 10:33:22.251748 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wrpmx"] Dec 01 10:33:23 crc kubenswrapper[4763]: I1201 10:33:23.005525 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" path="/var/lib/kubelet/pods/d676bd7a-0e7a-4277-9c05-800f6a20f809/volumes" Dec 01 10:33:27 crc kubenswrapper[4763]: I1201 10:33:27.994421 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:33:27 crc kubenswrapper[4763]: E1201 10:33:27.995065 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:33:38 crc kubenswrapper[4763]: I1201 10:33:38.995996 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:33:38 crc kubenswrapper[4763]: E1201 10:33:38.997018 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:33:49 crc kubenswrapper[4763]: I1201 10:33:49.994362 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:33:49 crc kubenswrapper[4763]: E1201 10:33:49.995173 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:34:01 crc kubenswrapper[4763]: I1201 10:34:01.995143 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:34:02 crc kubenswrapper[4763]: E1201 10:34:02.001060 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.439138 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dks56"] Dec 01 10:34:08 crc 
kubenswrapper[4763]: E1201 10:34:08.440437 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerName="extract-content" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.440482 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerName="extract-content" Dec 01 10:34:08 crc kubenswrapper[4763]: E1201 10:34:08.440504 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerName="extract-utilities" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.440515 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerName="extract-utilities" Dec 01 10:34:08 crc kubenswrapper[4763]: E1201 10:34:08.440532 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerName="registry-server" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.440543 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerName="registry-server" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.440912 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d676bd7a-0e7a-4277-9c05-800f6a20f809" containerName="registry-server" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.443199 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.447915 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dks56"] Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.532838 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-catalog-content\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.533037 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-utilities\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.533195 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6s7r\" (UniqueName: \"kubernetes.io/projected/1a8bd790-1990-4b82-b4c7-8828a50c1b25-kube-api-access-b6s7r\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.634811 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-catalog-content\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.634987 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-utilities\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.635050 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6s7r\" (UniqueName: \"kubernetes.io/projected/1a8bd790-1990-4b82-b4c7-8828a50c1b25-kube-api-access-b6s7r\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.635365 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-catalog-content\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.635719 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-utilities\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.663466 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6s7r\" (UniqueName: \"kubernetes.io/projected/1a8bd790-1990-4b82-b4c7-8828a50c1b25-kube-api-access-b6s7r\") pod \"certified-operators-dks56\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:08 crc kubenswrapper[4763]: I1201 10:34:08.771970 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:09 crc kubenswrapper[4763]: I1201 10:34:09.690955 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dks56"] Dec 01 10:34:10 crc kubenswrapper[4763]: I1201 10:34:10.641505 4763 generic.go:334] "Generic (PLEG): container finished" podID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerID="f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed" exitCode=0 Dec 01 10:34:10 crc kubenswrapper[4763]: I1201 10:34:10.641683 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dks56" event={"ID":"1a8bd790-1990-4b82-b4c7-8828a50c1b25","Type":"ContainerDied","Data":"f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed"} Dec 01 10:34:10 crc kubenswrapper[4763]: I1201 10:34:10.641840 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dks56" event={"ID":"1a8bd790-1990-4b82-b4c7-8828a50c1b25","Type":"ContainerStarted","Data":"ced4649bcbdc6fe7bc0b354fd3dc80f4e1aabce81bb610addc3726c587ac8ae7"} Dec 01 10:34:11 crc kubenswrapper[4763]: I1201 10:34:11.655985 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dks56" event={"ID":"1a8bd790-1990-4b82-b4c7-8828a50c1b25","Type":"ContainerStarted","Data":"2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01"} Dec 01 10:34:12 crc kubenswrapper[4763]: I1201 10:34:12.675014 4763 generic.go:334] "Generic (PLEG): container finished" podID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerID="2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01" exitCode=0 Dec 01 10:34:12 crc kubenswrapper[4763]: I1201 10:34:12.675064 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dks56" event={"ID":"1a8bd790-1990-4b82-b4c7-8828a50c1b25","Type":"ContainerDied","Data":"2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01"} Dec 01 10:34:12 crc kubenswrapper[4763]: I1201 10:34:12.994560 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:34:12 crc kubenswrapper[4763]: E1201 10:34:12.994910 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:34:14 crc kubenswrapper[4763]: I1201 10:34:14.694627 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dks56" event={"ID":"1a8bd790-1990-4b82-b4c7-8828a50c1b25","Type":"ContainerStarted","Data":"b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328"} Dec 01 10:34:14 crc kubenswrapper[4763]: I1201 10:34:14.716745 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dks56" podStartSLOduration=3.596595914 podStartE2EDuration="6.716727647s" podCreationTimestamp="2025-12-01 10:34:08 +0000 UTC" firstStartedPulling="2025-12-01 10:34:10.644209967 +0000 UTC m=+4767.912858735" lastFinishedPulling="2025-12-01 10:34:13.7643417 +0000 UTC m=+4771.032990468" observedRunningTime="2025-12-01 
10:34:14.7105714 +0000 UTC m=+4771.979220168" watchObservedRunningTime="2025-12-01 10:34:14.716727647 +0000 UTC m=+4771.985376415" Dec 01 10:34:17 crc kubenswrapper[4763]: I1201 10:34:17.828431 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-67pjt"] Dec 01 10:34:17 crc kubenswrapper[4763]: I1201 10:34:17.839422 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:17 crc kubenswrapper[4763]: I1201 10:34:17.857385 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-67pjt"] Dec 01 10:34:17 crc kubenswrapper[4763]: I1201 10:34:17.967995 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-utilities\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:17 crc kubenswrapper[4763]: I1201 10:34:17.968067 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-catalog-content\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:17 crc kubenswrapper[4763]: I1201 10:34:17.968212 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt7zw\" (UniqueName: \"kubernetes.io/projected/183f4ab1-f35d-4b1c-987e-033dff124282-kube-api-access-bt7zw\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.070516 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-utilities\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.070585 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-catalog-content\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.070814 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt7zw\" (UniqueName: \"kubernetes.io/projected/183f4ab1-f35d-4b1c-987e-033dff124282-kube-api-access-bt7zw\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.072443 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-utilities\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.072780 4763 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-catalog-content\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.097263 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt7zw\" (UniqueName: \"kubernetes.io/projected/183f4ab1-f35d-4b1c-987e-033dff124282-kube-api-access-bt7zw\") pod \"redhat-operators-67pjt\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.179327 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:18 crc kubenswrapper[4763]: W1201 10:34:18.745927 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183f4ab1_f35d_4b1c_987e_033dff124282.slice/crio-f3655232ce787be803538cdb912750ff0cebf986d294e2d0cff9edcbf7bed958 WatchSource:0}: Error finding container f3655232ce787be803538cdb912750ff0cebf986d294e2d0cff9edcbf7bed958: Status 404 returned error can't find the container with id f3655232ce787be803538cdb912750ff0cebf986d294e2d0cff9edcbf7bed958 Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.754904 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-67pjt"] Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.772706 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.772781 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:18 crc kubenswrapper[4763]: I1201 10:34:18.837380 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:19 crc kubenswrapper[4763]: E1201 10:34:19.229327 4763 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183f4ab1_f35d_4b1c_987e_033dff124282.slice/crio-05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183f4ab1_f35d_4b1c_987e_033dff124282.slice/crio-conmon-05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5.scope\": RecentStats: unable to find data in memory cache]" Dec 01 10:34:19 crc kubenswrapper[4763]: I1201 10:34:19.744447 4763 generic.go:334] "Generic (PLEG): container finished" podID="183f4ab1-f35d-4b1c-987e-033dff124282" containerID="05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5" exitCode=0 Dec 01 10:34:19 crc kubenswrapper[4763]: I1201 10:34:19.744722 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pjt" event={"ID":"183f4ab1-f35d-4b1c-987e-033dff124282","Type":"ContainerDied","Data":"05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5"} Dec 01 10:34:19 crc kubenswrapper[4763]: I1201 10:34:19.747124 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pjt" 
event={"ID":"183f4ab1-f35d-4b1c-987e-033dff124282","Type":"ContainerStarted","Data":"f3655232ce787be803538cdb912750ff0cebf986d294e2d0cff9edcbf7bed958"} Dec 01 10:34:19 crc kubenswrapper[4763]: I1201 10:34:19.804343 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:20 crc kubenswrapper[4763]: I1201 10:34:20.755825 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pjt" event={"ID":"183f4ab1-f35d-4b1c-987e-033dff124282","Type":"ContainerStarted","Data":"8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881"} Dec 01 10:34:21 crc kubenswrapper[4763]: I1201 10:34:21.202697 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dks56"] Dec 01 10:34:21 crc kubenswrapper[4763]: I1201 10:34:21.764092 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dks56" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerName="registry-server" containerID="cri-o://b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328" gracePeriod=2 Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.255908 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.379438 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-catalog-content\") pod \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.379500 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-utilities\") pod \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.379555 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6s7r\" (UniqueName: \"kubernetes.io/projected/1a8bd790-1990-4b82-b4c7-8828a50c1b25-kube-api-access-b6s7r\") pod \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\" (UID: \"1a8bd790-1990-4b82-b4c7-8828a50c1b25\") " Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.380318 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-utilities" (OuterVolumeSpecName: "utilities") pod "1a8bd790-1990-4b82-b4c7-8828a50c1b25" (UID: "1a8bd790-1990-4b82-b4c7-8828a50c1b25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.404762 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8bd790-1990-4b82-b4c7-8828a50c1b25-kube-api-access-b6s7r" (OuterVolumeSpecName: "kube-api-access-b6s7r") pod "1a8bd790-1990-4b82-b4c7-8828a50c1b25" (UID: "1a8bd790-1990-4b82-b4c7-8828a50c1b25"). InnerVolumeSpecName "kube-api-access-b6s7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.426502 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1a8bd790-1990-4b82-b4c7-8828a50c1b25" (UID: "1a8bd790-1990-4b82-b4c7-8828a50c1b25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.481908 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6s7r\" (UniqueName: \"kubernetes.io/projected/1a8bd790-1990-4b82-b4c7-8828a50c1b25-kube-api-access-b6s7r\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.481960 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.481975 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1a8bd790-1990-4b82-b4c7-8828a50c1b25-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.777165 4763 generic.go:334] "Generic (PLEG): container finished" podID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerID="b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328" exitCode=0 Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.777221 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dks56" event={"ID":"1a8bd790-1990-4b82-b4c7-8828a50c1b25","Type":"ContainerDied","Data":"b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328"} Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.777264 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dks56" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.777289 4763 scope.go:117] "RemoveContainer" containerID="b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.777271 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dks56" event={"ID":"1a8bd790-1990-4b82-b4c7-8828a50c1b25","Type":"ContainerDied","Data":"ced4649bcbdc6fe7bc0b354fd3dc80f4e1aabce81bb610addc3726c587ac8ae7"} Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.806740 4763 scope.go:117] "RemoveContainer" containerID="2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.848824 4763 scope.go:117] "RemoveContainer" containerID="f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.882732 4763 scope.go:117] "RemoveContainer" containerID="b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328" Dec 01 10:34:22 crc kubenswrapper[4763]: E1201 10:34:22.883236 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328\": container with ID starting with b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328 not found: ID does not exist" containerID="b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.883290 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328"} err="failed to get container status \"b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328\": rpc error: code = NotFound desc = could not find container \"b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328\": container with ID starting with b59e0793087238f2df5aef84946ac7a2ee51cb77330fe2d9b2d8c655ed0c3328 not found: ID does not exist" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.883316 4763 scope.go:117] "RemoveContainer" containerID="2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01" Dec 01 10:34:22 crc kubenswrapper[4763]: E1201 10:34:22.883917 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01\": container with ID starting with 2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01 not found: ID does not exist" containerID="2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.883941 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01"} err="failed to get container status \"2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01\": rpc error: code = NotFound desc = could not find container \"2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01\": container with ID starting with 2ff18ed5fec5c3f28ab1054c731b52930c04eb4a8d997122c264d5b07173ac01 not found: ID does not exist" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.883956 4763 scope.go:117] "RemoveContainer" 
containerID="f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed" Dec 01 10:34:22 crc kubenswrapper[4763]: E1201 10:34:22.884398 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed\": container with ID starting with f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed not found: ID does not exist" containerID="f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.884449 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed"} err="failed to get container status \"f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed\": rpc error: code = NotFound desc = could not find container \"f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed\": container with ID starting with f8019dbde88f5b8dd891939dd2f5fdcd8a1a106962e0b6328822967436d333ed not found: ID does not exist" Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.942542 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dks56"] Dec 01 10:34:22 crc kubenswrapper[4763]: I1201 10:34:22.953901 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dks56"] Dec 01 10:34:23 crc kubenswrapper[4763]: I1201 10:34:23.006631 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" path="/var/lib/kubelet/pods/1a8bd790-1990-4b82-b4c7-8828a50c1b25/volumes" Dec 01 10:34:25 crc kubenswrapper[4763]: I1201 10:34:25.817690 4763 generic.go:334] "Generic (PLEG): container finished" podID="183f4ab1-f35d-4b1c-987e-033dff124282" containerID="8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881" exitCode=0 Dec 01 10:34:25 crc kubenswrapper[4763]: I1201 10:34:25.817742 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pjt" event={"ID":"183f4ab1-f35d-4b1c-987e-033dff124282","Type":"ContainerDied","Data":"8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881"} Dec 01 10:34:25 crc kubenswrapper[4763]: I1201 10:34:25.994542 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:34:25 crc kubenswrapper[4763]: E1201 10:34:25.994795 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:34:26 crc kubenswrapper[4763]: I1201 10:34:26.828427 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pjt" event={"ID":"183f4ab1-f35d-4b1c-987e-033dff124282","Type":"ContainerStarted","Data":"2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b"} Dec 01 10:34:26 crc kubenswrapper[4763]: I1201 10:34:26.846143 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-67pjt" podStartSLOduration=3.322379942 podStartE2EDuration="9.846123014s" 
podCreationTimestamp="2025-12-01 10:34:17 +0000 UTC" firstStartedPulling="2025-12-01 10:34:19.74810199 +0000 UTC m=+4777.016750758" lastFinishedPulling="2025-12-01 10:34:26.271845052 +0000 UTC m=+4783.540493830" observedRunningTime="2025-12-01 10:34:26.844189251 +0000 UTC m=+4784.112838019" watchObservedRunningTime="2025-12-01 10:34:26.846123014 +0000 UTC m=+4784.114771782" Dec 01 10:34:28 crc kubenswrapper[4763]: I1201 10:34:28.179919 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:28 crc kubenswrapper[4763]: I1201 10:34:28.181331 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:29 crc kubenswrapper[4763]: I1201 10:34:29.232247 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-67pjt" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="registry-server" probeResult="failure" output=< Dec 01 10:34:29 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 10:34:29 crc kubenswrapper[4763]: > Dec 01 10:34:36 crc kubenswrapper[4763]: I1201 10:34:36.930702 4763 generic.go:334] "Generic (PLEG): container finished" podID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerID="de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6" exitCode=0 Dec 01 10:34:36 crc kubenswrapper[4763]: I1201 10:34:36.930779 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-prg7h/must-gather-mvdt6" event={"ID":"21287b27-72e1-4209-ad3a-412d76cdd1fe","Type":"ContainerDied","Data":"de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6"} Dec 01 10:34:36 crc kubenswrapper[4763]: I1201 10:34:36.932091 4763 scope.go:117] "RemoveContainer" containerID="de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6" Dec 01 10:34:37 crc kubenswrapper[4763]: I1201 10:34:37.093869 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-prg7h_must-gather-mvdt6_21287b27-72e1-4209-ad3a-412d76cdd1fe/gather/0.log" Dec 01 10:34:38 crc kubenswrapper[4763]: I1201 10:34:38.231849 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:38 crc kubenswrapper[4763]: I1201 10:34:38.291996 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:38 crc kubenswrapper[4763]: I1201 10:34:38.479105 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-67pjt"] Dec 01 10:34:39 crc kubenswrapper[4763]: I1201 10:34:39.979717 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-67pjt" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="registry-server" containerID="cri-o://2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b" gracePeriod=2
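
Annotation: the run above is the normal churn of an OLM catalog pod. redhat-operators-67pjt is created at 10:34:17, pulls its image until 10:34:26, fails its first startup probe at 10:34:29 while the registry is still loading its catalog, reports started/ready at 10:34:38, and is deleted via the API moments later (catalog pods are routinely recycled). The probe output, timeout: failed to connect service ":50051" within 1s, is the signature of a grpc_health_probe-style exec check against the registry's gRPC port. As an illustration only, not the actual probe binary, the connect-within-timeout part of that check reduces to a one-second dial; a minimal Go sketch, with the address and timeout taken from the log output above:

    package main

    import (
        "fmt"
        "net"
        "os"
        "time"
    )

    // A startup-probe-style check: exit 0 if something accepts a TCP
    // connection on :50051 within 1s, exit 1 otherwise. The real probe
    // additionally issues a gRPC health-check RPC; this sketch covers
    // only the connect-within-timeout behavior visible in the output.
    func main() {
        conn, err := net.DialTimeout("tcp", "localhost:50051", 1*time.Second)
        if err != nil {
            fmt.Fprintf(os.Stderr, "timeout: failed to connect service %q within 1s\n", ":50051")
            os.Exit(1)
        }
        conn.Close()
    }

The exit status maps directly onto the probe result: the kubelet records failure (10:34:29) until the registry starts listening, then flips the startup and readiness statuses (10:34:38).

Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.530416 4763 util.go:48] "No ready sandbox for pod can be found.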
Need to start a new one" pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.575949 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt7zw\" (UniqueName: \"kubernetes.io/projected/183f4ab1-f35d-4b1c-987e-033dff124282-kube-api-access-bt7zw\") pod \"183f4ab1-f35d-4b1c-987e-033dff124282\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.576019 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-catalog-content\") pod \"183f4ab1-f35d-4b1c-987e-033dff124282\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.576062 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-utilities\") pod \"183f4ab1-f35d-4b1c-987e-033dff124282\" (UID: \"183f4ab1-f35d-4b1c-987e-033dff124282\") " Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.577255 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-utilities" (OuterVolumeSpecName: "utilities") pod "183f4ab1-f35d-4b1c-987e-033dff124282" (UID: "183f4ab1-f35d-4b1c-987e-033dff124282"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.578318 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.586170 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183f4ab1-f35d-4b1c-987e-033dff124282-kube-api-access-bt7zw" (OuterVolumeSpecName: "kube-api-access-bt7zw") pod "183f4ab1-f35d-4b1c-987e-033dff124282" (UID: "183f4ab1-f35d-4b1c-987e-033dff124282"). InnerVolumeSpecName "kube-api-access-bt7zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.680883 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt7zw\" (UniqueName: \"kubernetes.io/projected/183f4ab1-f35d-4b1c-987e-033dff124282-kube-api-access-bt7zw\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.690915 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "183f4ab1-f35d-4b1c-987e-033dff124282" (UID: "183f4ab1-f35d-4b1c-987e-033dff124282"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.783348 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183f4ab1-f35d-4b1c-987e-033dff124282-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.993101 4763 generic.go:334] "Generic (PLEG): container finished" podID="183f4ab1-f35d-4b1c-987e-033dff124282" containerID="2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b" exitCode=0 Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.993218 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-67pjt" Dec 01 10:34:40 crc kubenswrapper[4763]: I1201 10:34:40.995303 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:34:40 crc kubenswrapper[4763]: E1201 10:34:40.996093 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.007738 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pjt" event={"ID":"183f4ab1-f35d-4b1c-987e-033dff124282","Type":"ContainerDied","Data":"2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b"} Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.007777 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-67pjt" event={"ID":"183f4ab1-f35d-4b1c-987e-033dff124282","Type":"ContainerDied","Data":"f3655232ce787be803538cdb912750ff0cebf986d294e2d0cff9edcbf7bed958"} Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.007796 4763 scope.go:117] "RemoveContainer" containerID="2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.037100 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-67pjt"] Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.041624 4763 scope.go:117] "RemoveContainer" containerID="8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.047285 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-67pjt"] Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.077633 4763 scope.go:117] "RemoveContainer" containerID="05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.116029 4763 scope.go:117] "RemoveContainer" containerID="2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b" Dec 01 10:34:41 crc kubenswrapper[4763]: E1201 10:34:41.116400 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b\": container with ID starting with 2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b not found: ID does not exist" 
containerID="2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.116433 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b"} err="failed to get container status \"2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b\": rpc error: code = NotFound desc = could not find container \"2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b\": container with ID starting with 2b2ebc927bc2191194a6c822dd9912c86e72afe82a6d6b79e052ead710f2214b not found: ID does not exist" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.116502 4763 scope.go:117] "RemoveContainer" containerID="8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881" Dec 01 10:34:41 crc kubenswrapper[4763]: E1201 10:34:41.116737 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881\": container with ID starting with 8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881 not found: ID does not exist" containerID="8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.116758 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881"} err="failed to get container status \"8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881\": rpc error: code = NotFound desc = could not find container \"8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881\": container with ID starting with 8d99e7fb36c01771dd96a38420f8be8d89accb8cdfc5f9edcffb6a44d228a881 not found: ID does not exist" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.116771 4763 scope.go:117] "RemoveContainer" containerID="05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5" Dec 01 10:34:41 crc kubenswrapper[4763]: E1201 10:34:41.116959 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5\": container with ID starting with 05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5 not found: ID does not exist" containerID="05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5" Dec 01 10:34:41 crc kubenswrapper[4763]: I1201 10:34:41.116979 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5"} err="failed to get container status \"05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5\": rpc error: code = NotFound desc = could not find container \"05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5\": container with ID starting with 05dd6f09b066a960dbcf61a297c7f60cf948deb1e27c870ab9b5d8d9d37300c5 not found: ID does not exist" Dec 01 10:34:43 crc kubenswrapper[4763]: I1201 10:34:43.022148 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" path="/var/lib/kubelet/pods/183f4ab1-f35d-4b1c-987e-033dff124282/volumes" Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.248438 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-prg7h/must-gather-mvdt6"] Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.249362 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-prg7h/must-gather-mvdt6" podUID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerName="copy" containerID="cri-o://e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1" gracePeriod=2 Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.260233 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-prg7h/must-gather-mvdt6"] Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.754815 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-prg7h_must-gather-mvdt6_21287b27-72e1-4209-ad3a-412d76cdd1fe/copy/0.log" Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.755528 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.827888 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skmmx\" (UniqueName: \"kubernetes.io/projected/21287b27-72e1-4209-ad3a-412d76cdd1fe-kube-api-access-skmmx\") pod \"21287b27-72e1-4209-ad3a-412d76cdd1fe\" (UID: \"21287b27-72e1-4209-ad3a-412d76cdd1fe\") " Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.828016 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/21287b27-72e1-4209-ad3a-412d76cdd1fe-must-gather-output\") pod \"21287b27-72e1-4209-ad3a-412d76cdd1fe\" (UID: \"21287b27-72e1-4209-ad3a-412d76cdd1fe\") " Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.834359 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21287b27-72e1-4209-ad3a-412d76cdd1fe-kube-api-access-skmmx" (OuterVolumeSpecName: "kube-api-access-skmmx") pod "21287b27-72e1-4209-ad3a-412d76cdd1fe" (UID: "21287b27-72e1-4209-ad3a-412d76cdd1fe"). InnerVolumeSpecName "kube-api-access-skmmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:34:47 crc kubenswrapper[4763]: I1201 10:34:47.930568 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skmmx\" (UniqueName: \"kubernetes.io/projected/21287b27-72e1-4209-ad3a-412d76cdd1fe-kube-api-access-skmmx\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.006107 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21287b27-72e1-4209-ad3a-412d76cdd1fe-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "21287b27-72e1-4209-ad3a-412d76cdd1fe" (UID: "21287b27-72e1-4209-ad3a-412d76cdd1fe"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.038819 4763 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/21287b27-72e1-4209-ad3a-412d76cdd1fe-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.070231 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-prg7h_must-gather-mvdt6_21287b27-72e1-4209-ad3a-412d76cdd1fe/copy/0.log" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.070674 4763 generic.go:334] "Generic (PLEG): container finished" podID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerID="e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1" exitCode=143 Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.070809 4763 scope.go:117] "RemoveContainer" containerID="e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.071021 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-prg7h/must-gather-mvdt6" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.116632 4763 scope.go:117] "RemoveContainer" containerID="de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.170825 4763 scope.go:117] "RemoveContainer" containerID="e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1" Dec 01 10:34:48 crc kubenswrapper[4763]: E1201 10:34:48.171802 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1\": container with ID starting with e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1 not found: ID does not exist" containerID="e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.171919 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1"} err="failed to get container status \"e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1\": rpc error: code = NotFound desc = could not find container \"e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1\": container with ID starting with e0613db92d2db391deab91136decc0fc5ec69f21b2e346bfb2ac40393dd222d1 not found: ID does not exist" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.172032 4763 scope.go:117] "RemoveContainer" containerID="de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6" Dec 01 10:34:48 crc kubenswrapper[4763]: E1201 10:34:48.172477 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6\": container with ID starting with de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6 not found: ID does not exist" containerID="de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6" Dec 01 10:34:48 crc kubenswrapper[4763]: I1201 10:34:48.172592 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6"} err="failed to get container status 
\"de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6\": rpc error: code = NotFound desc = could not find container \"de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6\": container with ID starting with de0d1e677424fb73eb5d8ab3d7673006f012a34e268b151b216a95f95ae44ac6 not found: ID does not exist" Dec 01 10:34:49 crc kubenswrapper[4763]: I1201 10:34:49.006310 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21287b27-72e1-4209-ad3a-412d76cdd1fe" path="/var/lib/kubelet/pods/21287b27-72e1-4209-ad3a-412d76cdd1fe/volumes" Dec 01 10:34:51 crc kubenswrapper[4763]: I1201 10:34:51.994209 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:34:51 crc kubenswrapper[4763]: E1201 10:34:51.995019 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:35:03 crc kubenswrapper[4763]: I1201 10:35:03.994865 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:35:03 crc kubenswrapper[4763]: E1201 10:35:03.995714 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:35:18 crc kubenswrapper[4763]: I1201 10:35:18.994682 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:35:18 crc kubenswrapper[4763]: E1201 10:35:18.995847 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:35:33 crc kubenswrapper[4763]: I1201 10:35:33.995539 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:35:34 crc kubenswrapper[4763]: I1201 10:35:34.489732 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"b11377138eb89562b1cf83f3e428880f88f84a0f6bac12f8f723c4c8b2960f61"} Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.466337 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llx5q"] Dec 01 10:36:06 crc kubenswrapper[4763]: E1201 10:36:06.467329 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerName="extract-content" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467342 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerName="extract-content" Dec 01 10:36:06 crc kubenswrapper[4763]: E1201 10:36:06.467351 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerName="copy" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467357 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerName="copy" Dec 01 10:36:06 crc kubenswrapper[4763]: E1201 10:36:06.467368 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="extract-utilities" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467374 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="extract-utilities" Dec 01 10:36:06 crc kubenswrapper[4763]: E1201 10:36:06.467386 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="registry-server" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467393 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="registry-server" Dec 01 10:36:06 crc kubenswrapper[4763]: E1201 10:36:06.467402 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerName="extract-utilities" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467408 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerName="extract-utilities" Dec 01 10:36:06 crc kubenswrapper[4763]: E1201 10:36:06.467427 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerName="registry-server" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467433 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerName="registry-server" Dec 01 10:36:06 crc kubenswrapper[4763]: E1201 10:36:06.467442 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="extract-content" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467448 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="extract-content" Dec 01 10:36:06 crc kubenswrapper[4763]: E1201 10:36:06.467489 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerName="gather" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467495 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerName="gather" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467672 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerName="gather" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467697 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="21287b27-72e1-4209-ad3a-412d76cdd1fe" containerName="copy" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467708 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="183f4ab1-f35d-4b1c-987e-033dff124282" containerName="registry-server" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.467721 4763 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1a8bd790-1990-4b82-b4c7-8828a50c1b25" containerName="registry-server" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.469040 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.476831 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llx5q"] Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.581164 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-utilities\") pod \"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.581284 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52xbf\" (UniqueName: \"kubernetes.io/projected/73730f16-b873-48e2-89a3-9c645bac55f8-kube-api-access-52xbf\") pod \"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.581320 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-catalog-content\") pod \"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.682819 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-utilities\") pod \"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.683143 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52xbf\" (UniqueName: \"kubernetes.io/projected/73730f16-b873-48e2-89a3-9c645bac55f8-kube-api-access-52xbf\") pod \"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.683255 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-catalog-content\") pod \"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.683427 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-utilities\") pod \"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.683769 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-catalog-content\") pod 
\"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.710526 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52xbf\" (UniqueName: \"kubernetes.io/projected/73730f16-b873-48e2-89a3-9c645bac55f8-kube-api-access-52xbf\") pod \"community-operators-llx5q\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:06 crc kubenswrapper[4763]: I1201 10:36:06.795178 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:07 crc kubenswrapper[4763]: I1201 10:36:07.326614 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llx5q"] Dec 01 10:36:07 crc kubenswrapper[4763]: I1201 10:36:07.831090 4763 generic.go:334] "Generic (PLEG): container finished" podID="73730f16-b873-48e2-89a3-9c645bac55f8" containerID="3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984" exitCode=0 Dec 01 10:36:07 crc kubenswrapper[4763]: I1201 10:36:07.831130 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llx5q" event={"ID":"73730f16-b873-48e2-89a3-9c645bac55f8","Type":"ContainerDied","Data":"3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984"} Dec 01 10:36:07 crc kubenswrapper[4763]: I1201 10:36:07.831172 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llx5q" event={"ID":"73730f16-b873-48e2-89a3-9c645bac55f8","Type":"ContainerStarted","Data":"a8a82094539cf11037780290bc65a750f56b2f05bbd0b493f8253669b107199f"} Dec 01 10:36:09 crc kubenswrapper[4763]: I1201 10:36:09.847138 4763 generic.go:334] "Generic (PLEG): container finished" podID="73730f16-b873-48e2-89a3-9c645bac55f8" containerID="2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1" exitCode=0 Dec 01 10:36:09 crc kubenswrapper[4763]: I1201 10:36:09.847174 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llx5q" event={"ID":"73730f16-b873-48e2-89a3-9c645bac55f8","Type":"ContainerDied","Data":"2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1"} Dec 01 10:36:10 crc kubenswrapper[4763]: I1201 10:36:10.861519 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llx5q" event={"ID":"73730f16-b873-48e2-89a3-9c645bac55f8","Type":"ContainerStarted","Data":"067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072"} Dec 01 10:36:10 crc kubenswrapper[4763]: I1201 10:36:10.885857 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llx5q" podStartSLOduration=2.418602414 podStartE2EDuration="4.885823627s" podCreationTimestamp="2025-12-01 10:36:06 +0000 UTC" firstStartedPulling="2025-12-01 10:36:07.833318538 +0000 UTC m=+4885.101967306" lastFinishedPulling="2025-12-01 10:36:10.300539751 +0000 UTC m=+4887.569188519" observedRunningTime="2025-12-01 10:36:10.881410187 +0000 UTC m=+4888.150058955" watchObservedRunningTime="2025-12-01 10:36:10.885823627 +0000 UTC m=+4888.154472395" Dec 01 10:36:16 crc kubenswrapper[4763]: I1201 10:36:16.797211 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:16 crc 
kubenswrapper[4763]: I1201 10:36:16.798944 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:16 crc kubenswrapper[4763]: I1201 10:36:16.877685 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:16 crc kubenswrapper[4763]: I1201 10:36:16.978549 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:17 crc kubenswrapper[4763]: I1201 10:36:17.115308 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llx5q"] Dec 01 10:36:18 crc kubenswrapper[4763]: I1201 10:36:18.941032 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-llx5q" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" containerName="registry-server" containerID="cri-o://067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072" gracePeriod=2 Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.438749 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.476990 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52xbf\" (UniqueName: \"kubernetes.io/projected/73730f16-b873-48e2-89a3-9c645bac55f8-kube-api-access-52xbf\") pod \"73730f16-b873-48e2-89a3-9c645bac55f8\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.477069 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-catalog-content\") pod \"73730f16-b873-48e2-89a3-9c645bac55f8\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.477174 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-utilities\") pod \"73730f16-b873-48e2-89a3-9c645bac55f8\" (UID: \"73730f16-b873-48e2-89a3-9c645bac55f8\") " Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.478636 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-utilities" (OuterVolumeSpecName: "utilities") pod "73730f16-b873-48e2-89a3-9c645bac55f8" (UID: "73730f16-b873-48e2-89a3-9c645bac55f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.485102 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73730f16-b873-48e2-89a3-9c645bac55f8-kube-api-access-52xbf" (OuterVolumeSpecName: "kube-api-access-52xbf") pod "73730f16-b873-48e2-89a3-9c645bac55f8" (UID: "73730f16-b873-48e2-89a3-9c645bac55f8"). InnerVolumeSpecName "kube-api-access-52xbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.580495 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52xbf\" (UniqueName: \"kubernetes.io/projected/73730f16-b873-48e2-89a3-9c645bac55f8-kube-api-access-52xbf\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.580530 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.763737 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73730f16-b873-48e2-89a3-9c645bac55f8" (UID: "73730f16-b873-48e2-89a3-9c645bac55f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.784278 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73730f16-b873-48e2-89a3-9c645bac55f8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.950776 4763 generic.go:334] "Generic (PLEG): container finished" podID="73730f16-b873-48e2-89a3-9c645bac55f8" containerID="067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072" exitCode=0 Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.950832 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llx5q" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.950836 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llx5q" event={"ID":"73730f16-b873-48e2-89a3-9c645bac55f8","Type":"ContainerDied","Data":"067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072"} Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.952113 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llx5q" event={"ID":"73730f16-b873-48e2-89a3-9c645bac55f8","Type":"ContainerDied","Data":"a8a82094539cf11037780290bc65a750f56b2f05bbd0b493f8253669b107199f"} Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.952135 4763 scope.go:117] "RemoveContainer" containerID="067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.973874 4763 scope.go:117] "RemoveContainer" containerID="2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1" Dec 01 10:36:19 crc kubenswrapper[4763]: I1201 10:36:19.995717 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-llx5q"] Dec 01 10:36:20 crc kubenswrapper[4763]: I1201 10:36:20.016338 4763 scope.go:117] "RemoveContainer" containerID="3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984" Dec 01 10:36:20 crc kubenswrapper[4763]: I1201 10:36:20.018831 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-llx5q"] Dec 01 10:36:20 crc kubenswrapper[4763]: I1201 10:36:20.052123 4763 scope.go:117] "RemoveContainer" containerID="067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072" Dec 01 10:36:20 crc kubenswrapper[4763]: E1201 10:36:20.052720 4763 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072\": container with ID starting with 067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072 not found: ID does not exist" containerID="067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072" Dec 01 10:36:20 crc kubenswrapper[4763]: I1201 10:36:20.052752 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072"} err="failed to get container status \"067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072\": rpc error: code = NotFound desc = could not find container \"067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072\": container with ID starting with 067ba8758742a902ca9e7b65311c4841fb7ab0c0b1ba91e53172406c07acb072 not found: ID does not exist" Dec 01 10:36:20 crc kubenswrapper[4763]: I1201 10:36:20.052782 4763 scope.go:117] "RemoveContainer" containerID="2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1" Dec 01 10:36:20 crc kubenswrapper[4763]: E1201 10:36:20.053165 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1\": container with ID starting with 2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1 not found: ID does not exist" containerID="2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1" Dec 01 10:36:20 crc kubenswrapper[4763]: I1201 10:36:20.053192 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1"} err="failed to get container status \"2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1\": rpc error: code = NotFound desc = could not find container \"2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1\": container with ID starting with 2cf088bc50c34f1617e2bfb4c39ae1dc0f8c6db398bef6cf39d8ecd92eba60d1 not found: ID does not exist" Dec 01 10:36:20 crc kubenswrapper[4763]: I1201 10:36:20.053205 4763 scope.go:117] "RemoveContainer" containerID="3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984" Dec 01 10:36:20 crc kubenswrapper[4763]: E1201 10:36:20.053487 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984\": container with ID starting with 3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984 not found: ID does not exist" containerID="3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984" Dec 01 10:36:20 crc kubenswrapper[4763]: I1201 10:36:20.053510 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984"} err="failed to get container status \"3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984\": rpc error: code = NotFound desc = could not find container \"3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984\": container with ID starting with 3140d62023e49b999142638f32a9fce1e0ef6bbd61189a64c30f2e9d18d38984 not found: ID does not exist" Dec 01 10:36:21 crc kubenswrapper[4763]: I1201 10:36:21.003136 4763 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" path="/var/lib/kubelet/pods/73730f16-b873-48e2-89a3-9c645bac55f8/volumes" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.098861 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n26h2/must-gather-wdhjr"] Dec 01 10:37:34 crc kubenswrapper[4763]: E1201 10:37:34.099743 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" containerName="extract-content" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.099757 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" containerName="extract-content" Dec 01 10:37:34 crc kubenswrapper[4763]: E1201 10:37:34.099794 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" containerName="extract-utilities" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.099800 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" containerName="extract-utilities" Dec 01 10:37:34 crc kubenswrapper[4763]: E1201 10:37:34.099818 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" containerName="registry-server" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.099824 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" containerName="registry-server" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.100004 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="73730f16-b873-48e2-89a3-9c645bac55f8" containerName="registry-server" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.101002 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.107953 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n26h2"/"kube-root-ca.crt" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.108192 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n26h2"/"default-dockercfg-tfvkp" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.108314 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n26h2"/"openshift-service-ca.crt" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.136294 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n26h2/must-gather-wdhjr"] Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.190560 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58a2af15-6ccd-439b-9fac-039e0f6a9342-must-gather-output\") pod \"must-gather-wdhjr\" (UID: \"58a2af15-6ccd-439b-9fac-039e0f6a9342\") " pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.190672 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4bjd\" (UniqueName: \"kubernetes.io/projected/58a2af15-6ccd-439b-9fac-039e0f6a9342-kube-api-access-v4bjd\") pod \"must-gather-wdhjr\" (UID: \"58a2af15-6ccd-439b-9fac-039e0f6a9342\") " pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.293074 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58a2af15-6ccd-439b-9fac-039e0f6a9342-must-gather-output\") pod \"must-gather-wdhjr\" (UID: \"58a2af15-6ccd-439b-9fac-039e0f6a9342\") " pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.293222 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4bjd\" (UniqueName: \"kubernetes.io/projected/58a2af15-6ccd-439b-9fac-039e0f6a9342-kube-api-access-v4bjd\") pod \"must-gather-wdhjr\" (UID: \"58a2af15-6ccd-439b-9fac-039e0f6a9342\") " pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.293492 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58a2af15-6ccd-439b-9fac-039e0f6a9342-must-gather-output\") pod \"must-gather-wdhjr\" (UID: \"58a2af15-6ccd-439b-9fac-039e0f6a9342\") " pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.313805 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4bjd\" (UniqueName: \"kubernetes.io/projected/58a2af15-6ccd-439b-9fac-039e0f6a9342-kube-api-access-v4bjd\") pod \"must-gather-wdhjr\" (UID: \"58a2af15-6ccd-439b-9fac-039e0f6a9342\") " pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.428317 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:37:34 crc kubenswrapper[4763]: I1201 10:37:34.950565 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n26h2/must-gather-wdhjr"] Dec 01 10:37:35 crc kubenswrapper[4763]: I1201 10:37:35.201476 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/must-gather-wdhjr" event={"ID":"58a2af15-6ccd-439b-9fac-039e0f6a9342","Type":"ContainerStarted","Data":"35b04df4263df52662fc9e3447b56e28ee7cd9c3cf39b067aa05435150d51355"} Dec 01 10:37:36 crc kubenswrapper[4763]: I1201 10:37:36.212299 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/must-gather-wdhjr" event={"ID":"58a2af15-6ccd-439b-9fac-039e0f6a9342","Type":"ContainerStarted","Data":"afb94e44f36ae2c089a040cf16a2e3e107aadde89d750f57ac881d77c0af6033"} Dec 01 10:37:36 crc kubenswrapper[4763]: I1201 10:37:36.212673 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/must-gather-wdhjr" event={"ID":"58a2af15-6ccd-439b-9fac-039e0f6a9342","Type":"ContainerStarted","Data":"adc12370e74f36eead0078fc9bbf84d8451aadfed6bf8b58c758509e620da6e6"} Dec 01 10:37:36 crc kubenswrapper[4763]: I1201 10:37:36.227951 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n26h2/must-gather-wdhjr" podStartSLOduration=2.227929683 podStartE2EDuration="2.227929683s" podCreationTimestamp="2025-12-01 10:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:37:36.227344747 +0000 UTC m=+4973.495993515" watchObservedRunningTime="2025-12-01 10:37:36.227929683 +0000 UTC m=+4973.496578451" Dec 01 10:37:39 crc kubenswrapper[4763]: I1201 10:37:39.853183 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n26h2/crc-debug-scchp"] Dec 01 10:37:39 crc kubenswrapper[4763]: I1201 10:37:39.854923 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:37:39 crc kubenswrapper[4763]: I1201 10:37:39.904957 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3481898-4da4-498b-b243-46bf9a4e0a25-host\") pod \"crc-debug-scchp\" (UID: \"d3481898-4da4-498b-b243-46bf9a4e0a25\") " pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:37:39 crc kubenswrapper[4763]: I1201 10:37:39.905098 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6j29\" (UniqueName: \"kubernetes.io/projected/d3481898-4da4-498b-b243-46bf9a4e0a25-kube-api-access-v6j29\") pod \"crc-debug-scchp\" (UID: \"d3481898-4da4-498b-b243-46bf9a4e0a25\") " pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:37:40 crc kubenswrapper[4763]: I1201 10:37:40.006401 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3481898-4da4-498b-b243-46bf9a4e0a25-host\") pod \"crc-debug-scchp\" (UID: \"d3481898-4da4-498b-b243-46bf9a4e0a25\") " pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:37:40 crc kubenswrapper[4763]: I1201 10:37:40.006569 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6j29\" (UniqueName: \"kubernetes.io/projected/d3481898-4da4-498b-b243-46bf9a4e0a25-kube-api-access-v6j29\") pod \"crc-debug-scchp\" (UID: \"d3481898-4da4-498b-b243-46bf9a4e0a25\") " pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:37:40 crc kubenswrapper[4763]: I1201 10:37:40.006605 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3481898-4da4-498b-b243-46bf9a4e0a25-host\") pod \"crc-debug-scchp\" (UID: \"d3481898-4da4-498b-b243-46bf9a4e0a25\") " pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:37:40 crc kubenswrapper[4763]: I1201 10:37:40.025588 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6j29\" (UniqueName: \"kubernetes.io/projected/d3481898-4da4-498b-b243-46bf9a4e0a25-kube-api-access-v6j29\") pod \"crc-debug-scchp\" (UID: \"d3481898-4da4-498b-b243-46bf9a4e0a25\") " pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:37:40 crc kubenswrapper[4763]: I1201 10:37:40.179006 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:37:40 crc kubenswrapper[4763]: I1201 10:37:40.250076 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/crc-debug-scchp" event={"ID":"d3481898-4da4-498b-b243-46bf9a4e0a25","Type":"ContainerStarted","Data":"f3eb81b6a694264db9007517646d826ece00720bebbceed7c37fa70c2307a418"} Dec 01 10:37:41 crc kubenswrapper[4763]: I1201 10:37:41.261080 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/crc-debug-scchp" event={"ID":"d3481898-4da4-498b-b243-46bf9a4e0a25","Type":"ContainerStarted","Data":"8e6e073ca9c2bdb7b148726f4527104504674241328104ce68cea114a22eb8f9"} Dec 01 10:37:41 crc kubenswrapper[4763]: I1201 10:37:41.280996 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n26h2/crc-debug-scchp" podStartSLOduration=2.280978732 podStartE2EDuration="2.280978732s" podCreationTimestamp="2025-12-01 10:37:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-01 10:37:41.27284802 +0000 UTC m=+4978.541496788" watchObservedRunningTime="2025-12-01 10:37:41.280978732 +0000 UTC m=+4978.549627500" Dec 01 10:38:03 crc kubenswrapper[4763]: I1201 10:38:03.929694 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:38:03 crc kubenswrapper[4763]: I1201 10:38:03.930311 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
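
Annotation: the liveness failure above is an HTTP GET against machine-config-daemon's health endpoint that is refused outright; the daemon has been crash-looping (see the CrashLoopBackOff entries earlier), so nothing is listening on 127.0.0.1:8798. An HTTP liveness check is essentially a GET with a deadline that treats any non-2xx/3xx status, or any transport error such as this connection refusal, as failure. A minimal Go sketch of that behavior (illustrative, not the kubelet's prober code):

    package main

    import (
        "fmt"
        "net/http"
        "os"
        "time"
    )

    // probeHTTP mimics an HTTP liveness check: GET the URL with a short
    // timeout and treat any 2xx/3xx response as healthy. "connection
    // refused", as in the log above, surfaces here as a transport error.
    func probeHTTP(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 400 {
            return fmt.Errorf("unhealthy status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probeHTTP("http://127.0.0.1:8798/health", 1*time.Second); err != nil {
            fmt.Fprintln(os.Stderr, "Probe failed:", err)
            os.Exit(1)
        }
        fmt.Println("Probe succeeded")
    }

Dec 01 10:38:16 crc kubenswrapper[4763]: I1201 10:38:16.594536 4763 generic.go:334] "Generic (PLEG): container finished" podID="d3481898-4da4-498b-b243-46bf9a4e0a25" containerID="8e6e073ca9c2bdb7b148726f4527104504674241328104ce68cea114a22eb8f9" exitCode=0 Dec 01 10:38:16 crc kubenswrapper[4763]: I1201 10:38:16.594607 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/crc-debug-scchp" event={"ID":"d3481898-4da4-498b-b243-46bf9a4e0a25","Type":"ContainerDied","Data":"8e6e073ca9c2bdb7b148726f4527104504674241328104ce68cea114a22eb8f9"} Dec 01 10:38:17 crc kubenswrapper[4763]: I1201 10:38:17.710697 4763 util.go:48] "No ready sandbox for pod can be found.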
Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:38:17 crc kubenswrapper[4763]: I1201 10:38:17.745842 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n26h2/crc-debug-scchp"] Dec 01 10:38:17 crc kubenswrapper[4763]: I1201 10:38:17.753752 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n26h2/crc-debug-scchp"] Dec 01 10:38:17 crc kubenswrapper[4763]: I1201 10:38:17.800806 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3481898-4da4-498b-b243-46bf9a4e0a25-host\") pod \"d3481898-4da4-498b-b243-46bf9a4e0a25\" (UID: \"d3481898-4da4-498b-b243-46bf9a4e0a25\") " Dec 01 10:38:17 crc kubenswrapper[4763]: I1201 10:38:17.800974 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3481898-4da4-498b-b243-46bf9a4e0a25-host" (OuterVolumeSpecName: "host") pod "d3481898-4da4-498b-b243-46bf9a4e0a25" (UID: "d3481898-4da4-498b-b243-46bf9a4e0a25"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:38:17 crc kubenswrapper[4763]: I1201 10:38:17.801718 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3481898-4da4-498b-b243-46bf9a4e0a25-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:17 crc kubenswrapper[4763]: I1201 10:38:17.902771 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6j29\" (UniqueName: \"kubernetes.io/projected/d3481898-4da4-498b-b243-46bf9a4e0a25-kube-api-access-v6j29\") pod \"d3481898-4da4-498b-b243-46bf9a4e0a25\" (UID: \"d3481898-4da4-498b-b243-46bf9a4e0a25\") " Dec 01 10:38:17 crc kubenswrapper[4763]: I1201 10:38:17.909049 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3481898-4da4-498b-b243-46bf9a4e0a25-kube-api-access-v6j29" (OuterVolumeSpecName: "kube-api-access-v6j29") pod "d3481898-4da4-498b-b243-46bf9a4e0a25" (UID: "d3481898-4da4-498b-b243-46bf9a4e0a25"). InnerVolumeSpecName "kube-api-access-v6j29". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:38:18 crc kubenswrapper[4763]: I1201 10:38:18.005030 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6j29\" (UniqueName: \"kubernetes.io/projected/d3481898-4da4-498b-b243-46bf9a4e0a25-kube-api-access-v6j29\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:18 crc kubenswrapper[4763]: I1201 10:38:18.615443 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3eb81b6a694264db9007517646d826ece00720bebbceed7c37fa70c2307a418" Dec 01 10:38:18 crc kubenswrapper[4763]: I1201 10:38:18.615634 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-scchp" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.007391 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3481898-4da4-498b-b243-46bf9a4e0a25" path="/var/lib/kubelet/pods/d3481898-4da4-498b-b243-46bf9a4e0a25/volumes" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.019186 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n26h2/crc-debug-5h8xs"] Dec 01 10:38:19 crc kubenswrapper[4763]: E1201 10:38:19.019724 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3481898-4da4-498b-b243-46bf9a4e0a25" containerName="container-00" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.019754 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3481898-4da4-498b-b243-46bf9a4e0a25" containerName="container-00" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.020022 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3481898-4da4-498b-b243-46bf9a4e0a25" containerName="container-00" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.021001 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.025385 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mbl5\" (UniqueName: \"kubernetes.io/projected/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-kube-api-access-6mbl5\") pod \"crc-debug-5h8xs\" (UID: \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\") " pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.025593 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-host\") pod \"crc-debug-5h8xs\" (UID: \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\") " pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.126276 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mbl5\" (UniqueName: \"kubernetes.io/projected/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-kube-api-access-6mbl5\") pod \"crc-debug-5h8xs\" (UID: \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\") " pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.126376 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-host\") pod \"crc-debug-5h8xs\" (UID: \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\") " pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.126545 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-host\") pod \"crc-debug-5h8xs\" (UID: \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\") " pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.146691 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mbl5\" (UniqueName: \"kubernetes.io/projected/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-kube-api-access-6mbl5\") pod \"crc-debug-5h8xs\" (UID: \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\") " 
pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.348258 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:19 crc kubenswrapper[4763]: W1201 10:38:19.379907 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod786acbd6_5a64_4d7f_b73a_9c68e4daf1cc.slice/crio-8d2434d37d08fa01a68b1a9d6164454f2513f7275abb83df4fa8d1054dcc6cf8 WatchSource:0}: Error finding container 8d2434d37d08fa01a68b1a9d6164454f2513f7275abb83df4fa8d1054dcc6cf8: Status 404 returned error can't find the container with id 8d2434d37d08fa01a68b1a9d6164454f2513f7275abb83df4fa8d1054dcc6cf8 Dec 01 10:38:19 crc kubenswrapper[4763]: I1201 10:38:19.634081 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/crc-debug-5h8xs" event={"ID":"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc","Type":"ContainerStarted","Data":"8d2434d37d08fa01a68b1a9d6164454f2513f7275abb83df4fa8d1054dcc6cf8"} Dec 01 10:38:20 crc kubenswrapper[4763]: I1201 10:38:20.644412 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/crc-debug-5h8xs" event={"ID":"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc","Type":"ContainerStarted","Data":"0837fb9d86249d386c56b3baa4dd8372cb24abde41c4c0cf0fddee5e71223b8a"} Dec 01 10:38:21 crc kubenswrapper[4763]: I1201 10:38:21.658856 4763 generic.go:334] "Generic (PLEG): container finished" podID="786acbd6-5a64-4d7f-b73a-9c68e4daf1cc" containerID="0837fb9d86249d386c56b3baa4dd8372cb24abde41c4c0cf0fddee5e71223b8a" exitCode=0 Dec 01 10:38:21 crc kubenswrapper[4763]: I1201 10:38:21.659122 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/crc-debug-5h8xs" event={"ID":"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc","Type":"ContainerDied","Data":"0837fb9d86249d386c56b3baa4dd8372cb24abde41c4c0cf0fddee5e71223b8a"} Dec 01 10:38:21 crc kubenswrapper[4763]: I1201 10:38:21.986440 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n26h2/crc-debug-5h8xs"] Dec 01 10:38:21 crc kubenswrapper[4763]: I1201 10:38:21.995369 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n26h2/crc-debug-5h8xs"] Dec 01 10:38:22 crc kubenswrapper[4763]: I1201 10:38:22.790495 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:22 crc kubenswrapper[4763]: I1201 10:38:22.911976 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-host\") pod \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\" (UID: \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\") " Dec 01 10:38:22 crc kubenswrapper[4763]: I1201 10:38:22.912035 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mbl5\" (UniqueName: \"kubernetes.io/projected/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-kube-api-access-6mbl5\") pod \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\" (UID: \"786acbd6-5a64-4d7f-b73a-9c68e4daf1cc\") " Dec 01 10:38:22 crc kubenswrapper[4763]: I1201 10:38:22.913165 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-host" (OuterVolumeSpecName: "host") pod "786acbd6-5a64-4d7f-b73a-9c68e4daf1cc" (UID: "786acbd6-5a64-4d7f-b73a-9c68e4daf1cc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:38:22 crc kubenswrapper[4763]: I1201 10:38:22.920299 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-kube-api-access-6mbl5" (OuterVolumeSpecName: "kube-api-access-6mbl5") pod "786acbd6-5a64-4d7f-b73a-9c68e4daf1cc" (UID: "786acbd6-5a64-4d7f-b73a-9c68e4daf1cc"). InnerVolumeSpecName "kube-api-access-6mbl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.004816 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="786acbd6-5a64-4d7f-b73a-9c68e4daf1cc" path="/var/lib/kubelet/pods/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc/volumes" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.013996 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.014222 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mbl5\" (UniqueName: \"kubernetes.io/projected/786acbd6-5a64-4d7f-b73a-9c68e4daf1cc-kube-api-access-6mbl5\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.213187 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n26h2/crc-debug-9tsnl"] Dec 01 10:38:23 crc kubenswrapper[4763]: E1201 10:38:23.213659 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786acbd6-5a64-4d7f-b73a-9c68e4daf1cc" containerName="container-00" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.213682 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="786acbd6-5a64-4d7f-b73a-9c68e4daf1cc" containerName="container-00" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.213933 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="786acbd6-5a64-4d7f-b73a-9c68e4daf1cc" containerName="container-00" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.214692 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.217507 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8640634b-a87c-401b-aac0-11e53eb82be4-host\") pod \"crc-debug-9tsnl\" (UID: \"8640634b-a87c-401b-aac0-11e53eb82be4\") " pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.217734 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwsdt\" (UniqueName: \"kubernetes.io/projected/8640634b-a87c-401b-aac0-11e53eb82be4-kube-api-access-vwsdt\") pod \"crc-debug-9tsnl\" (UID: \"8640634b-a87c-401b-aac0-11e53eb82be4\") " pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.319110 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8640634b-a87c-401b-aac0-11e53eb82be4-host\") pod \"crc-debug-9tsnl\" (UID: \"8640634b-a87c-401b-aac0-11e53eb82be4\") " pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.319276 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwsdt\" (UniqueName: \"kubernetes.io/projected/8640634b-a87c-401b-aac0-11e53eb82be4-kube-api-access-vwsdt\") pod \"crc-debug-9tsnl\" (UID: \"8640634b-a87c-401b-aac0-11e53eb82be4\") " pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.319587 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8640634b-a87c-401b-aac0-11e53eb82be4-host\") pod \"crc-debug-9tsnl\" (UID: \"8640634b-a87c-401b-aac0-11e53eb82be4\") " pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.344713 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwsdt\" (UniqueName: \"kubernetes.io/projected/8640634b-a87c-401b-aac0-11e53eb82be4-kube-api-access-vwsdt\") pod \"crc-debug-9tsnl\" (UID: \"8640634b-a87c-401b-aac0-11e53eb82be4\") " pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.531882 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:23 crc kubenswrapper[4763]: W1201 10:38:23.563750 4763 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8640634b_a87c_401b_aac0_11e53eb82be4.slice/crio-2f8bf44b87c38cac44cf090704e5438e553e7a2599ac373d9e0e6a26d6418df5 WatchSource:0}: Error finding container 2f8bf44b87c38cac44cf090704e5438e553e7a2599ac373d9e0e6a26d6418df5: Status 404 returned error can't find the container with id 2f8bf44b87c38cac44cf090704e5438e553e7a2599ac373d9e0e6a26d6418df5 Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.679606 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/crc-debug-9tsnl" event={"ID":"8640634b-a87c-401b-aac0-11e53eb82be4","Type":"ContainerStarted","Data":"2f8bf44b87c38cac44cf090704e5438e553e7a2599ac373d9e0e6a26d6418df5"} Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.682786 4763 scope.go:117] "RemoveContainer" containerID="0837fb9d86249d386c56b3baa4dd8372cb24abde41c4c0cf0fddee5e71223b8a" Dec 01 10:38:23 crc kubenswrapper[4763]: I1201 10:38:23.682972 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-5h8xs" Dec 01 10:38:24 crc kubenswrapper[4763]: I1201 10:38:24.691751 4763 generic.go:334] "Generic (PLEG): container finished" podID="8640634b-a87c-401b-aac0-11e53eb82be4" containerID="fe1fa0e288216a85f96361692d599fccb86d34c643f5eb80cd1fd7d8aa37ef0e" exitCode=0 Dec 01 10:38:24 crc kubenswrapper[4763]: I1201 10:38:24.691896 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/crc-debug-9tsnl" event={"ID":"8640634b-a87c-401b-aac0-11e53eb82be4","Type":"ContainerDied","Data":"fe1fa0e288216a85f96361692d599fccb86d34c643f5eb80cd1fd7d8aa37ef0e"} Dec 01 10:38:24 crc kubenswrapper[4763]: I1201 10:38:24.726489 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n26h2/crc-debug-9tsnl"] Dec 01 10:38:24 crc kubenswrapper[4763]: I1201 10:38:24.735248 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n26h2/crc-debug-9tsnl"] Dec 01 10:38:25 crc kubenswrapper[4763]: I1201 10:38:25.821778 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:25 crc kubenswrapper[4763]: I1201 10:38:25.974101 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8640634b-a87c-401b-aac0-11e53eb82be4-host\") pod \"8640634b-a87c-401b-aac0-11e53eb82be4\" (UID: \"8640634b-a87c-401b-aac0-11e53eb82be4\") " Dec 01 10:38:25 crc kubenswrapper[4763]: I1201 10:38:25.974237 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8640634b-a87c-401b-aac0-11e53eb82be4-host" (OuterVolumeSpecName: "host") pod "8640634b-a87c-401b-aac0-11e53eb82be4" (UID: "8640634b-a87c-401b-aac0-11e53eb82be4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 01 10:38:25 crc kubenswrapper[4763]: I1201 10:38:25.974287 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwsdt\" (UniqueName: \"kubernetes.io/projected/8640634b-a87c-401b-aac0-11e53eb82be4-kube-api-access-vwsdt\") pod \"8640634b-a87c-401b-aac0-11e53eb82be4\" (UID: \"8640634b-a87c-401b-aac0-11e53eb82be4\") " Dec 01 10:38:25 crc kubenswrapper[4763]: I1201 10:38:25.974970 4763 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8640634b-a87c-401b-aac0-11e53eb82be4-host\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:25 crc kubenswrapper[4763]: I1201 10:38:25.985412 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8640634b-a87c-401b-aac0-11e53eb82be4-kube-api-access-vwsdt" (OuterVolumeSpecName: "kube-api-access-vwsdt") pod "8640634b-a87c-401b-aac0-11e53eb82be4" (UID: "8640634b-a87c-401b-aac0-11e53eb82be4"). InnerVolumeSpecName "kube-api-access-vwsdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:38:26 crc kubenswrapper[4763]: I1201 10:38:26.077071 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwsdt\" (UniqueName: \"kubernetes.io/projected/8640634b-a87c-401b-aac0-11e53eb82be4-kube-api-access-vwsdt\") on node \"crc\" DevicePath \"\"" Dec 01 10:38:26 crc kubenswrapper[4763]: I1201 10:38:26.711510 4763 scope.go:117] "RemoveContainer" containerID="fe1fa0e288216a85f96361692d599fccb86d34c643f5eb80cd1fd7d8aa37ef0e" Dec 01 10:38:26 crc kubenswrapper[4763]: I1201 10:38:26.711522 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n26h2/crc-debug-9tsnl" Dec 01 10:38:27 crc kubenswrapper[4763]: I1201 10:38:27.005992 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8640634b-a87c-401b-aac0-11e53eb82be4" path="/var/lib/kubelet/pods/8640634b-a87c-401b-aac0-11e53eb82be4/volumes" Dec 01 10:38:33 crc kubenswrapper[4763]: I1201 10:38:33.929695 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:38:33 crc kubenswrapper[4763]: I1201 10:38:33.930300 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:39:03 crc kubenswrapper[4763]: I1201 10:39:03.929097 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:39:03 crc kubenswrapper[4763]: I1201 10:39:03.929683 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:39:03 crc 
kubenswrapper[4763]: I1201 10:39:03.929732 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 10:39:03 crc kubenswrapper[4763]: I1201 10:39:03.930546 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b11377138eb89562b1cf83f3e428880f88f84a0f6bac12f8f723c4c8b2960f61"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:39:03 crc kubenswrapper[4763]: I1201 10:39:03.930593 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://b11377138eb89562b1cf83f3e428880f88f84a0f6bac12f8f723c4c8b2960f61" gracePeriod=600 Dec 01 10:39:05 crc kubenswrapper[4763]: I1201 10:39:05.055023 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="b11377138eb89562b1cf83f3e428880f88f84a0f6bac12f8f723c4c8b2960f61" exitCode=0 Dec 01 10:39:05 crc kubenswrapper[4763]: I1201 10:39:05.055178 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"b11377138eb89562b1cf83f3e428880f88f84a0f6bac12f8f723c4c8b2960f61"} Dec 01 10:39:05 crc kubenswrapper[4763]: I1201 10:39:05.055593 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f"} Dec 01 10:39:05 crc kubenswrapper[4763]: I1201 10:39:05.055615 4763 scope.go:117] "RemoveContainer" containerID="35e572652d04b2c98ad92bb6b25303f3b28cb9d05845efa719295213019237e2" Dec 01 10:39:41 crc kubenswrapper[4763]: I1201 10:39:41.254751 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d56769fd6-btjcl_6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2/barbican-api/0.log" Dec 01 10:39:41 crc kubenswrapper[4763]: I1201 10:39:41.322349 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-d56769fd6-btjcl_6c487c78-2bc9-4d3b-82ed-5b07a6f7aac2/barbican-api-log/0.log" Dec 01 10:39:41 crc kubenswrapper[4763]: I1201 10:39:41.481642 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7db8c7446b-kkdjv_2a3e2134-8fc4-4ec2-9970-92959ed1778e/barbican-keystone-listener/0.log" Dec 01 10:39:41 crc kubenswrapper[4763]: I1201 10:39:41.577057 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7db8c7446b-kkdjv_2a3e2134-8fc4-4ec2-9970-92959ed1778e/barbican-keystone-listener-log/0.log" Dec 01 10:39:41 crc kubenswrapper[4763]: I1201 10:39:41.626602 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6948ddcbd7-94xjg_9dac2a9d-f2f8-4167-8f68-f01c9364a59f/barbican-worker/0.log" Dec 01 10:39:41 crc kubenswrapper[4763]: I1201 10:39:41.724426 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6948ddcbd7-94xjg_9dac2a9d-f2f8-4167-8f68-f01c9364a59f/barbican-worker-log/0.log" Dec 01 
10:39:41 crc kubenswrapper[4763]: I1201 10:39:41.905368 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ph9jj_249234f7-8f79-4a99-a35b-d43677150bf6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:42 crc kubenswrapper[4763]: I1201 10:39:42.014850 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fcb70e3b-87a0-49ff-8946-e182808bf846/ceilometer-central-agent/0.log" Dec 01 10:39:42 crc kubenswrapper[4763]: I1201 10:39:42.244843 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fcb70e3b-87a0-49ff-8946-e182808bf846/proxy-httpd/0.log" Dec 01 10:39:42 crc kubenswrapper[4763]: I1201 10:39:42.293039 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fcb70e3b-87a0-49ff-8946-e182808bf846/ceilometer-notification-agent/0.log" Dec 01 10:39:42 crc kubenswrapper[4763]: I1201 10:39:42.327093 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_fcb70e3b-87a0-49ff-8946-e182808bf846/sg-core/0.log" Dec 01 10:39:42 crc kubenswrapper[4763]: I1201 10:39:42.549327 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-7c4lp_93089553-1487-4d3b-ab46-1cd7822aa6ad/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:42 crc kubenswrapper[4763]: I1201 10:39:42.551007 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-4rbzc_e060abda-70ed-4adb-8756-15046c2a2f9d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:42 crc kubenswrapper[4763]: I1201 10:39:42.809754 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_054a3443-a215-4987-993e-ea2d282c1d0d/cinder-api-log/0.log" Dec 01 10:39:42 crc kubenswrapper[4763]: I1201 10:39:42.897237 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_054a3443-a215-4987-993e-ea2d282c1d0d/cinder-api/0.log" Dec 01 10:39:43 crc kubenswrapper[4763]: I1201 10:39:43.164170 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_60bdb953-6735-4247-8287-16dbf4187c03/cinder-backup/0.log" Dec 01 10:39:43 crc kubenswrapper[4763]: I1201 10:39:43.202601 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_60bdb953-6735-4247-8287-16dbf4187c03/probe/0.log" Dec 01 10:39:43 crc kubenswrapper[4763]: I1201 10:39:43.402604 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e7475a3a-70b9-44d0-94b2-3c3890185f85/cinder-scheduler/0.log" Dec 01 10:39:43 crc kubenswrapper[4763]: I1201 10:39:43.764211 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e7475a3a-70b9-44d0-94b2-3c3890185f85/probe/0.log" Dec 01 10:39:43 crc kubenswrapper[4763]: I1201 10:39:43.842084 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_1a72cdbd-892b-459d-86e2-0dde31be5e39/probe/0.log" Dec 01 10:39:43 crc kubenswrapper[4763]: I1201 10:39:43.913555 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_1a72cdbd-892b-459d-86e2-0dde31be5e39/cinder-volume/0.log" Dec 01 10:39:44 crc kubenswrapper[4763]: I1201 10:39:44.138417 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-t2dg7_9076b862-2a04-47bc-a6f6-bb99cd48ec2b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:44 crc kubenswrapper[4763]: I1201 10:39:44.217947 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sqjqw_66848849-497d-4488-898a-c529d1ef2736/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:44 crc kubenswrapper[4763]: I1201 10:39:44.444083 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c846ff5b9-2xtfs_34405493-5281-4822-b8f1-11e68aa61470/init/0.log" Dec 01 10:39:44 crc kubenswrapper[4763]: I1201 10:39:44.649800 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c846ff5b9-2xtfs_34405493-5281-4822-b8f1-11e68aa61470/init/0.log" Dec 01 10:39:44 crc kubenswrapper[4763]: I1201 10:39:44.698835 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f34e35d3-3a43-49ee-bee1-9ccc51135eb7/glance-httpd/0.log" Dec 01 10:39:45 crc kubenswrapper[4763]: I1201 10:39:45.007028 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5c846ff5b9-2xtfs_34405493-5281-4822-b8f1-11e68aa61470/dnsmasq-dns/0.log" Dec 01 10:39:45 crc kubenswrapper[4763]: I1201 10:39:45.015417 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f34e35d3-3a43-49ee-bee1-9ccc51135eb7/glance-log/0.log" Dec 01 10:39:45 crc kubenswrapper[4763]: I1201 10:39:45.232386 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78/glance-httpd/0.log" Dec 01 10:39:45 crc kubenswrapper[4763]: I1201 10:39:45.391495 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c09ed6b4-4a4c-4c3c-8b25-05ffbf3a7b78/glance-log/0.log" Dec 01 10:39:45 crc kubenswrapper[4763]: I1201 10:39:45.518108 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c998bbd96-rw26q_28f61e56-22f7-45cc-ba59-9624e668a73d/horizon/0.log" Dec 01 10:39:45 crc kubenswrapper[4763]: I1201 10:39:45.754363 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c998bbd96-rw26q_28f61e56-22f7-45cc-ba59-9624e668a73d/horizon-log/0.log" Dec 01 10:39:45 crc kubenswrapper[4763]: I1201 10:39:45.782115 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-pxb2c_7aeeb400-9fd6-4b3f-a3f9-71ec35ac0ac4/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.052174 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-c97lf_feb1b8da-b40b-439e-a27e-3f78045bbf86/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.249516 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29409721-pwb6r_80866302-0e51-4cd8-8fb1-5106a5764fb8/keystone-cron/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.269761 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7cddbbbc75-sb8z7_f503ba10-c7e4-4615-9137-8138e0dfb3f9/keystone-api/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.381257 4763 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_22ccd9e5-e651-4111-adfa-853c6d838d96/kube-state-metrics/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.525855 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-5ntdl_a458d267-3663-4b9e-baa3-c3711a334c80/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.685143 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6730d033-63cf-46f2-b779-e751663b7735/manila-api-log/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.765579 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6730d033-63cf-46f2-b779-e751663b7735/manila-api/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.903762 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_3cf5f5cb-a8a0-40f4-8acc-1f41415052d2/probe/0.log" Dec 01 10:39:46 crc kubenswrapper[4763]: I1201 10:39:46.940218 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_3cf5f5cb-a8a0-40f4-8acc-1f41415052d2/manila-scheduler/0.log" Dec 01 10:39:47 crc kubenswrapper[4763]: I1201 10:39:47.040132 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_18c3fc4b-5681-40d1-8b65-e85af0a1905e/manila-share/0.log" Dec 01 10:39:47 crc kubenswrapper[4763]: I1201 10:39:47.134208 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_18c3fc4b-5681-40d1-8b65-e85af0a1905e/probe/0.log" Dec 01 10:39:47 crc kubenswrapper[4763]: I1201 10:39:47.519535 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54896b4dfc-stxgl_e30619d4-84f7-4a40-aca1-b6885d608e03/neutron-api/0.log" Dec 01 10:39:47 crc kubenswrapper[4763]: I1201 10:39:47.561321 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54896b4dfc-stxgl_e30619d4-84f7-4a40-aca1-b6885d608e03/neutron-httpd/0.log" Dec 01 10:39:47 crc kubenswrapper[4763]: I1201 10:39:47.638396 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-zs6hk_a6949d90-ef2d-4555-87b8-0929fd2048b4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:48 crc kubenswrapper[4763]: I1201 10:39:48.306536 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5c652d84-294e-4f79-bbcd-37fca6657cd6/nova-api-log/0.log" Dec 01 10:39:48 crc kubenswrapper[4763]: I1201 10:39:48.381241 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a0b95396-e7dd-4b49-b465-db158816b7ea/nova-cell0-conductor-conductor/0.log" Dec 01 10:39:48 crc kubenswrapper[4763]: I1201 10:39:48.736436 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1004fbb5-ee5c-4328-bb1f-9054e4224138/nova-cell1-conductor-conductor/0.log" Dec 01 10:39:48 crc kubenswrapper[4763]: I1201 10:39:48.837794 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_34e5c16d-5a9f-43f0-a2ba-ca4a768891a7/nova-cell1-novncproxy-novncproxy/0.log" Dec 01 10:39:48 crc kubenswrapper[4763]: I1201 10:39:48.889298 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5c652d84-294e-4f79-bbcd-37fca6657cd6/nova-api-api/0.log" Dec 01 10:39:49 crc 
kubenswrapper[4763]: I1201 10:39:49.092503 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-bq7hk_43b000e9-21e9-47f9-8bc7-a93a8747159e/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:49 crc kubenswrapper[4763]: I1201 10:39:49.340543 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b10d5a09-4c44-4fb9-bcc0-b04612dde39c/nova-metadata-log/0.log" Dec 01 10:39:49 crc kubenswrapper[4763]: I1201 10:39:49.712867 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e/mysql-bootstrap/0.log" Dec 01 10:39:49 crc kubenswrapper[4763]: I1201 10:39:49.778392 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_002374b8-d55c-4996-9fb9-0e4fc758dc7f/nova-scheduler-scheduler/0.log" Dec 01 10:39:50 crc kubenswrapper[4763]: I1201 10:39:50.290279 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e/mysql-bootstrap/0.log" Dec 01 10:39:50 crc kubenswrapper[4763]: I1201 10:39:50.891850 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7ad5d50f-9d2b-49ea-b2e2-3a03fbb3d17e/galera/0.log" Dec 01 10:39:51 crc kubenswrapper[4763]: I1201 10:39:51.027024 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fc05c35a-b504-4104-a515-737272f6b4d9/mysql-bootstrap/0.log" Dec 01 10:39:51 crc kubenswrapper[4763]: I1201 10:39:51.247957 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fc05c35a-b504-4104-a515-737272f6b4d9/mysql-bootstrap/0.log" Dec 01 10:39:51 crc kubenswrapper[4763]: I1201 10:39:51.296225 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fc05c35a-b504-4104-a515-737272f6b4d9/galera/0.log" Dec 01 10:39:51 crc kubenswrapper[4763]: I1201 10:39:51.403603 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b10d5a09-4c44-4fb9-bcc0-b04612dde39c/nova-metadata-metadata/0.log" Dec 01 10:39:51 crc kubenswrapper[4763]: I1201 10:39:51.477279 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b22eceb2-ee23-4a6c-a993-bb280fe2d41f/openstackclient/0.log" Dec 01 10:39:51 crc kubenswrapper[4763]: I1201 10:39:51.630510 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-26n5d_68a1c130-7d5e-4679-9ec7-dd63b84cc8d5/ovn-controller/0.log" Dec 01 10:39:51 crc kubenswrapper[4763]: I1201 10:39:51.715909 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-4p79n_065901e9-140d-472b-8ed7-6e800f992c73/openstack-network-exporter/0.log" Dec 01 10:39:51 crc kubenswrapper[4763]: I1201 10:39:51.959606 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d2z4q_ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7/ovsdb-server-init/0.log" Dec 01 10:39:52 crc kubenswrapper[4763]: I1201 10:39:52.486191 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d2z4q_ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7/ovsdb-server-init/0.log" Dec 01 10:39:52 crc kubenswrapper[4763]: I1201 10:39:52.547970 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-d2z4q_ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7/ovsdb-server/0.log" Dec 01 10:39:52 crc kubenswrapper[4763]: I1201 10:39:52.596901 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-d2z4q_ac4b5c0f-58b0-4eab-95bc-27bee39cd7a7/ovs-vswitchd/0.log" Dec 01 10:39:52 crc kubenswrapper[4763]: I1201 10:39:52.833864 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc/openstack-network-exporter/0.log" Dec 01 10:39:52 crc kubenswrapper[4763]: I1201 10:39:52.838308 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7d82w_be3fab30-e99d-4b1a-ba2c-86326fbeb363/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:53 crc kubenswrapper[4763]: I1201 10:39:53.024831 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f3ed8a90-9c7d-46aa-9f2d-45e16fe734dc/ovn-northd/0.log" Dec 01 10:39:53 crc kubenswrapper[4763]: I1201 10:39:53.211481 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fe31b02-1b96-439f-bc58-9d2d2700d35b/openstack-network-exporter/0.log" Dec 01 10:39:53 crc kubenswrapper[4763]: I1201 10:39:53.375681 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0fe31b02-1b96-439f-bc58-9d2d2700d35b/ovsdbserver-nb/0.log" Dec 01 10:39:53 crc kubenswrapper[4763]: I1201 10:39:53.460090 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2aab79d8-d046-4877-9fa0-12d87132a99f/openstack-network-exporter/0.log" Dec 01 10:39:53 crc kubenswrapper[4763]: I1201 10:39:53.600171 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2aab79d8-d046-4877-9fa0-12d87132a99f/ovsdbserver-sb/0.log" Dec 01 10:39:53 crc kubenswrapper[4763]: I1201 10:39:53.764670 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bbd79555b-kk8vr_cdaab12f-7433-420e-bdf0-99ef2e2f5707/placement-api/0.log" Dec 01 10:39:53 crc kubenswrapper[4763]: I1201 10:39:53.910870 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bbd79555b-kk8vr_cdaab12f-7433-420e-bdf0-99ef2e2f5707/placement-log/0.log" Dec 01 10:39:53 crc kubenswrapper[4763]: I1201 10:39:53.985650 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d10e5ae-f63a-4bdf-b3f5-2f99e6856799/setup-container/0.log" Dec 01 10:39:54 crc kubenswrapper[4763]: I1201 10:39:54.254655 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d10e5ae-f63a-4bdf-b3f5-2f99e6856799/setup-container/0.log" Dec 01 10:39:54 crc kubenswrapper[4763]: I1201 10:39:54.393348 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6051a720-a09e-4c11-a9c4-465be3730f65/setup-container/0.log" Dec 01 10:39:54 crc kubenswrapper[4763]: I1201 10:39:54.397502 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d10e5ae-f63a-4bdf-b3f5-2f99e6856799/rabbitmq/0.log" Dec 01 10:39:54 crc kubenswrapper[4763]: I1201 10:39:54.609074 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6051a720-a09e-4c11-a9c4-465be3730f65/rabbitmq/0.log" Dec 01 10:39:54 crc kubenswrapper[4763]: I1201 10:39:54.726920 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_6051a720-a09e-4c11-a9c4-465be3730f65/setup-container/0.log" Dec 01 10:39:54 crc kubenswrapper[4763]: I1201 10:39:54.728069 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-xxqsx_2308a6d4-3af7-4772-8413-3803ac516e1c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:55 crc kubenswrapper[4763]: I1201 10:39:55.080003 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-49bzm_09695b88-6f6f-469a-b41a-02cd50e1f216/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:55 crc kubenswrapper[4763]: I1201 10:39:55.193341 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zbl9m_b41d9a03-82fb-4f14-b13c-0437ae28a1a7/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:39:55 crc kubenswrapper[4763]: I1201 10:39:55.383295 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-q84bf_57f56dbe-6a50-4a2b-9e50-2a6b87d9afeb/ssh-known-hosts-edpm-deployment/0.log" Dec 01 10:39:55 crc kubenswrapper[4763]: I1201 10:39:55.581087 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_74d118c3-e544-4a7a-ad22-de496e16f9ee/tempest-tests-tempest-tests-runner/0.log" Dec 01 10:39:55 crc kubenswrapper[4763]: I1201 10:39:55.778204 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c0ee43b4-efab-48de-ba62-dedd80d83711/test-operator-logs-container/0.log" Dec 01 10:39:56 crc kubenswrapper[4763]: I1201 10:39:56.062001 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-sh4nv_0fee9e86-e1af-4201-9817-bf22f5910477/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 01 10:40:09 crc kubenswrapper[4763]: I1201 10:40:09.984120 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f17760ee-44e7-4bf5-b9d6-368f9b780426/memcached/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.271147 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/util/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.423164 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/util/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.441205 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/pull/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.498607 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/pull/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.694116 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/extract/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.705592 
4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/util/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.744624 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_85fd02cafdc0b2fad1e83db46281f63c6837568b03212b06e8a12311fddqbxr_40baea6c-c32e-4f93-b01e-d94c309c05f7/pull/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.922571 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qh7dr_1208b653-3551-4266-99b9-e83fb86b4771/kube-rbac-proxy/0.log" Dec 01 10:40:30 crc kubenswrapper[4763]: I1201 10:40:30.944672 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-qh7dr_1208b653-3551-4266-99b9-e83fb86b4771/manager/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.010360 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-nh5mz_1c980f4b-c55c-4a2e-9461-9f89ec0165c3/kube-rbac-proxy/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.255923 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-nh5mz_1c980f4b-c55c-4a2e-9461-9f89ec0165c3/manager/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.285879 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-l547d_ee716572-1b36-4216-84a6-ad3f4ac2b7f6/manager/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.299136 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-l547d_ee716572-1b36-4216-84a6-ad3f4ac2b7f6/kube-rbac-proxy/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.536245 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-xx8m2_5325eff2-4650-499e-9cad-f486bae74fce/kube-rbac-proxy/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.590429 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-xx8m2_5325eff2-4650-499e-9cad-f486bae74fce/manager/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.743966 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rjzdx_eae2e950-9f81-49cc-926e-380b81a0f0e7/kube-rbac-proxy/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.840090 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-rjzdx_eae2e950-9f81-49cc-926e-380b81a0f0e7/manager/0.log" Dec 01 10:40:31 crc kubenswrapper[4763]: I1201 10:40:31.964135 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dbhs6_1e77763b-639f-46aa-a798-e39251aa8636/kube-rbac-proxy/0.log" Dec 01 10:40:32 crc kubenswrapper[4763]: I1201 10:40:32.120053 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-dbhs6_1e77763b-639f-46aa-a798-e39251aa8636/manager/0.log" Dec 01 10:40:32 crc 
kubenswrapper[4763]: I1201 10:40:32.168571 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-9vjrk_2aecf763-7a48-4c6f-a66c-ea391befd47a/kube-rbac-proxy/0.log" Dec 01 10:40:32 crc kubenswrapper[4763]: I1201 10:40:32.442827 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-9vjrk_2aecf763-7a48-4c6f-a66c-ea391befd47a/manager/0.log" Dec 01 10:40:32 crc kubenswrapper[4763]: I1201 10:40:32.516015 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4srbr_c0ed7161-2907-48a6-894d-c6e3a1f47e0e/kube-rbac-proxy/0.log" Dec 01 10:40:32 crc kubenswrapper[4763]: I1201 10:40:32.584979 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-4srbr_c0ed7161-2907-48a6-894d-c6e3a1f47e0e/manager/0.log" Dec 01 10:40:32 crc kubenswrapper[4763]: I1201 10:40:32.763149 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-t74fr_f5580ab2-73e2-4766-8e9c-f217fd4c079d/kube-rbac-proxy/0.log" Dec 01 10:40:32 crc kubenswrapper[4763]: I1201 10:40:32.820508 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-t74fr_f5580ab2-73e2-4766-8e9c-f217fd4c079d/manager/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.025936 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-9x2q4_4e44f450-61c5-4f49-b16b-8c9e0f060879/kube-rbac-proxy/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.084999 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-9x2q4_4e44f450-61c5-4f49-b16b-8c9e0f060879/manager/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.133795 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-nb5kc_9eef10a1-bfcc-412c-9687-fee23d90d448/kube-rbac-proxy/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.311807 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-9f5r9_a0713966-4e10-4b8b-84bc-6560d1b1bf5a/kube-rbac-proxy/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.340805 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-nb5kc_9eef10a1-bfcc-412c-9687-fee23d90d448/manager/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.443833 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-9f5r9_a0713966-4e10-4b8b-84bc-6560d1b1bf5a/manager/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.611286 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-jxvhd_0ee4b811-c59a-4120-bf78-53fe9e049d4b/kube-rbac-proxy/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.769284 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-jbvjc_ddef7d32-1d4c-496d-be36-7ae7af64205a/kube-rbac-proxy/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.778878 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-jxvhd_0ee4b811-c59a-4120-bf78-53fe9e049d4b/manager/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.852962 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-jbvjc_ddef7d32-1d4c-496d-be36-7ae7af64205a/manager/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.997488 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj_e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0/kube-rbac-proxy/0.log" Dec 01 10:40:33 crc kubenswrapper[4763]: I1201 10:40:33.997721 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4z57lj_e9b6a6e4-ad2c-4f39-bf7f-777e5ddb62d0/manager/0.log" Dec 01 10:40:34 crc kubenswrapper[4763]: I1201 10:40:34.431647 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-4qcm5_f78f08c4-82f9-45c0-92d4-325c6e066d44/registry-server/0.log" Dec 01 10:40:34 crc kubenswrapper[4763]: I1201 10:40:34.535379 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5794fdf75-8f5zw_de261a18-aec0-4ea5-aaf9-e313631599e6/operator/0.log" Dec 01 10:40:34 crc kubenswrapper[4763]: I1201 10:40:34.689667 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-k2nzk_c274cd4c-0c77-485c-8d8f-116a2f7b013b/kube-rbac-proxy/0.log" Dec 01 10:40:34 crc kubenswrapper[4763]: I1201 10:40:34.838437 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-k2nzk_c274cd4c-0c77-485c-8d8f-116a2f7b013b/manager/0.log" Dec 01 10:40:34 crc kubenswrapper[4763]: I1201 10:40:34.896512 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-tk5xf_13a2ac2b-0374-4da0-abbf-6aecbc3afbb8/kube-rbac-proxy/0.log" Dec 01 10:40:34 crc kubenswrapper[4763]: I1201 10:40:34.952987 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-tk5xf_13a2ac2b-0374-4da0-abbf-6aecbc3afbb8/manager/0.log" Dec 01 10:40:35 crc kubenswrapper[4763]: I1201 10:40:35.071551 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-s4rlp_6b7d748f-a9e4-416a-8fd7-9fa46ca2060d/operator/0.log" Dec 01 10:40:35 crc kubenswrapper[4763]: I1201 10:40:35.276223 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d555457c4-jcpzh_dd971a72-ce63-45cb-9457-43fcea25f677/manager/0.log" Dec 01 10:40:35 crc kubenswrapper[4763]: I1201 10:40:35.298996 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-mqvfc_20d8f0e3-8406-4e55-adbf-0681e090a82e/kube-rbac-proxy/0.log" Dec 01 10:40:35 crc kubenswrapper[4763]: I1201 10:40:35.323529 4763 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-mqvfc_20d8f0e3-8406-4e55-adbf-0681e090a82e/manager/0.log" Dec 01 10:40:35 crc kubenswrapper[4763]: I1201 10:40:35.795131 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2skdl_89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0/kube-rbac-proxy/0.log" Dec 01 10:40:35 crc kubenswrapper[4763]: I1201 10:40:35.869761 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-64k9k_664dabb5-40f4-44c4-be9d-1870e153c877/kube-rbac-proxy/0.log" Dec 01 10:40:35 crc kubenswrapper[4763]: I1201 10:40:35.876028 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-2skdl_89ab6c2d-ac49-4ec8-8b4a-34ec58416dd0/manager/0.log" Dec 01 10:40:36 crc kubenswrapper[4763]: I1201 10:40:36.023686 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-64k9k_664dabb5-40f4-44c4-be9d-1870e153c877/manager/0.log" Dec 01 10:40:36 crc kubenswrapper[4763]: I1201 10:40:36.136425 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-ctr5d_3f494774-a168-4199-bfff-e73f64a669cf/kube-rbac-proxy/0.log" Dec 01 10:40:36 crc kubenswrapper[4763]: I1201 10:40:36.204838 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-ctr5d_3f494774-a168-4199-bfff-e73f64a669cf/manager/0.log" Dec 01 10:40:57 crc kubenswrapper[4763]: I1201 10:40:57.434022 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-l2f2g_f6329e56-18d1-4479-8699-897fdfdc60fb/control-plane-machine-set-operator/0.log" Dec 01 10:40:57 crc kubenswrapper[4763]: I1201 10:40:57.582600 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l4kcj_2307c38a-2af7-4b03-b99a-e5ca5bed76a8/kube-rbac-proxy/0.log" Dec 01 10:40:57 crc kubenswrapper[4763]: I1201 10:40:57.661088 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-l4kcj_2307c38a-2af7-4b03-b99a-e5ca5bed76a8/machine-api-operator/0.log" Dec 01 10:41:11 crc kubenswrapper[4763]: I1201 10:41:11.151708 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-s2hg7_f4a5555c-4f44-4a2e-a9bf-6daab8490e32/cert-manager-controller/0.log" Dec 01 10:41:11 crc kubenswrapper[4763]: I1201 10:41:11.238934 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-tq2xd_99f9aa79-5bae-4215-b137-baef58e56e96/cert-manager-cainjector/0.log" Dec 01 10:41:11 crc kubenswrapper[4763]: I1201 10:41:11.383135 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-mjbm5_8c962037-246b-4727-8aab-6632e2e9e5f7/cert-manager-webhook/0.log" Dec 01 10:41:24 crc kubenswrapper[4763]: I1201 10:41:24.525832 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-rsnnh_757fd525-a1b2-45c0-a3eb-7b8c3d6423d3/nmstate-console-plugin/0.log" Dec 01 10:41:25 crc kubenswrapper[4763]: I1201 10:41:25.102956 4763 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-v2v9v_6789e53e-89e7-4593-a298-4b9eb0e0cf24/nmstate-metrics/0.log" Dec 01 10:41:25 crc kubenswrapper[4763]: I1201 10:41:25.106802 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-h8q8q_1e79baf9-ce6c-4a92-891f-54eba3049168/nmstate-handler/0.log" Dec 01 10:41:25 crc kubenswrapper[4763]: I1201 10:41:25.153228 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-v2v9v_6789e53e-89e7-4593-a298-4b9eb0e0cf24/kube-rbac-proxy/0.log" Dec 01 10:41:25 crc kubenswrapper[4763]: I1201 10:41:25.332848 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-5vb8f_8b28a543-e1fd-4862-af9e-b4c77a652700/nmstate-operator/0.log" Dec 01 10:41:25 crc kubenswrapper[4763]: I1201 10:41:25.421725 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-z2gjv_31ef1fc8-adab-4f75-bac1-e8ff859eb425/nmstate-webhook/0.log" Dec 01 10:41:33 crc kubenswrapper[4763]: I1201 10:41:33.929414 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:41:33 crc kubenswrapper[4763]: I1201 10:41:33.930009 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.195544 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-68xth_39d8a539-ae28-40cb-b850-d40b3cc839b8/controller/0.log" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.248120 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-68xth_39d8a539-ae28-40cb-b850-d40b3cc839b8/kube-rbac-proxy/0.log" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.449467 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-frr-files/0.log" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.646126 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-reloader/0.log" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.685862 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-frr-files/0.log" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.708992 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-metrics/0.log" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.736746 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-reloader/0.log" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.985949 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-frr-files/0.log" 
Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.989043 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-reloader/0.log" Dec 01 10:41:40 crc kubenswrapper[4763]: I1201 10:41:40.989714 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-metrics/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.119484 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-metrics/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.267625 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-reloader/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.268975 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-frr-files/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.302358 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/cp-metrics/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.337168 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/controller/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.470133 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/kube-rbac-proxy/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.523282 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/frr-metrics/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.658119 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/kube-rbac-proxy-frr/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.895827 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/reloader/0.log" Dec 01 10:41:41 crc kubenswrapper[4763]: I1201 10:41:41.920127 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-6kd8p_e1113818-415e-494e-8979-9de8da7db507/frr-k8s-webhook-server/0.log" Dec 01 10:41:42 crc kubenswrapper[4763]: I1201 10:41:42.181166 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7488746df5-gj8c5_e3849fc4-ce1d-43c8-b53e-189fa5fd4aa5/manager/0.log" Dec 01 10:41:42 crc kubenswrapper[4763]: I1201 10:41:42.417850 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ljjmd_bd336d9e-af02-4fc6-ae34-147c379ba374/kube-rbac-proxy/0.log" Dec 01 10:41:42 crc kubenswrapper[4763]: I1201 10:41:42.418651 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c7c865bc4-5b725_eeaeb040-1a35-4174-9c9c-7ffe226a79e5/webhook-server/0.log" Dec 01 10:41:43 crc kubenswrapper[4763]: I1201 10:41:43.201714 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ljjmd_bd336d9e-af02-4fc6-ae34-147c379ba374/speaker/0.log" Dec 01 10:41:43 crc kubenswrapper[4763]: I1201 
10:41:43.240407 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cg9jb_9a10b30c-69e2-4037-bc91-dfd5191a6e72/frr/0.log" Dec 01 10:41:55 crc kubenswrapper[4763]: I1201 10:41:55.587736 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/util/0.log" Dec 01 10:41:55 crc kubenswrapper[4763]: I1201 10:41:55.731474 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/util/0.log" Dec 01 10:41:55 crc kubenswrapper[4763]: I1201 10:41:55.779336 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/pull/0.log" Dec 01 10:41:55 crc kubenswrapper[4763]: I1201 10:41:55.855554 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/pull/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.045969 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/util/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.046905 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/pull/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.133380 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7lz9n_427c5d0e-a085-4795-9df8-47584898bc8c/extract/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.264428 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/util/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.429994 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/util/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.431054 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/pull/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.515002 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/pull/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.710815 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/extract/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.769097 4763 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/util/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.802537 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nfvsv_fc7608a8-f3a1-46e6-9aa2-2331d7b4cf94/pull/0.log" Dec 01 10:41:56 crc kubenswrapper[4763]: I1201 10:41:56.913983 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-utilities/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.147942 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-utilities/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.199309 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-content/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.201819 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-content/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.303989 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-utilities/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.332316 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/extract-content/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.531907 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-utilities/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.895244 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-utilities/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.915028 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-content/0.log" Dec 01 10:41:57 crc kubenswrapper[4763]: I1201 10:41:57.977126 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-content/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.152167 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hc4lf_a09b38c8-91a6-45ed-b97b-d0370e99ab11/registry-server/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.210901 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-content/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.211702 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/extract-utilities/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.565073 4763 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7mwbs_8ed98359-8184-409c-9f5d-f2b2b21b9cb7/marketplace-operator/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.596981 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-vjr8n_26163ccc-7fc4-4baa-9bf0-7ca523c888ea/registry-server/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.733699 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-utilities/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.914618 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-content/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.928609 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-content/0.log" Dec 01 10:41:58 crc kubenswrapper[4763]: I1201 10:41:58.971127 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-utilities/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.139014 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-content/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.182022 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/extract-utilities/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.386739 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xwfqr_52470a23-807a-48bd-968d-eb43cb36b804/registry-server/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.401298 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-utilities/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.592671 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-content/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.607875 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-utilities/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.635513 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-content/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.804065 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-utilities/0.log" Dec 01 10:41:59 crc kubenswrapper[4763]: I1201 10:41:59.810194 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/extract-content/0.log" Dec 01 10:42:00 crc kubenswrapper[4763]: I1201 10:42:00.093477 4763 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fsnqs_678966e0-053a-40f9-b29f-84b8ab6dbc83/registry-server/0.log" Dec 01 10:42:03 crc kubenswrapper[4763]: I1201 10:42:03.929279 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:42:03 crc kubenswrapper[4763]: I1201 10:42:03.930071 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.006132 4763 patch_prober.go:28] interesting pod/machine-config-daemon-l5kgb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.008190 4763 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.008236 4763 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.009228 4763 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f"} pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.009300 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerName="machine-config-daemon" containerID="cri-o://9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" gracePeriod=600 Dec 01 10:42:34 crc kubenswrapper[4763]: E1201 10:42:34.139939 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.534399 4763 generic.go:334] "Generic (PLEG): container finished" podID="f95ef452-7057-4afb-a8ca-1c505b953c2e" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" exitCode=0 Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.534602 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerDied","Data":"9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f"} Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.534783 4763 scope.go:117] "RemoveContainer" containerID="b11377138eb89562b1cf83f3e428880f88f84a0f6bac12f8f723c4c8b2960f61" Dec 01 10:42:34 crc kubenswrapper[4763]: I1201 10:42:34.535444 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:42:34 crc kubenswrapper[4763]: E1201 10:42:34.535895 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:42:46 crc kubenswrapper[4763]: I1201 10:42:46.995380 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:42:46 crc kubenswrapper[4763]: E1201 10:42:46.996294 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:43:00 crc kubenswrapper[4763]: I1201 10:43:00.993766 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:43:00 crc kubenswrapper[4763]: E1201 10:43:00.994382 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:43:11 crc kubenswrapper[4763]: I1201 10:43:11.994328 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:43:11 crc kubenswrapper[4763]: E1201 10:43:11.995193 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:43:24 crc kubenswrapper[4763]: I1201 10:43:24.994539 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:43:24 crc kubenswrapper[4763]: E1201 10:43:24.996677 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:43:36 crc kubenswrapper[4763]: I1201 10:43:36.994778 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:43:36 crc kubenswrapper[4763]: E1201 10:43:36.995635 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:43:51 crc kubenswrapper[4763]: I1201 10:43:51.994749 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:43:51 crc kubenswrapper[4763]: E1201 10:43:51.996606 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:44:06 crc kubenswrapper[4763]: I1201 10:44:06.994207 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:44:06 crc kubenswrapper[4763]: E1201 10:44:06.995243 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:44:10 crc kubenswrapper[4763]: I1201 10:44:10.839387 4763 scope.go:117] "RemoveContainer" containerID="8e6e073ca9c2bdb7b148726f4527104504674241328104ce68cea114a22eb8f9" Dec 01 10:44:20 crc kubenswrapper[4763]: I1201 10:44:20.994164 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:44:20 crc kubenswrapper[4763]: E1201 10:44:20.995197 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:44:24 crc kubenswrapper[4763]: I1201 10:44:24.769845 4763 generic.go:334] "Generic (PLEG): container finished" podID="58a2af15-6ccd-439b-9fac-039e0f6a9342" containerID="adc12370e74f36eead0078fc9bbf84d8451aadfed6bf8b58c758509e620da6e6" exitCode=0 Dec 01 10:44:24 crc kubenswrapper[4763]: I1201 10:44:24.769913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n26h2/must-gather-wdhjr" 
event={"ID":"58a2af15-6ccd-439b-9fac-039e0f6a9342","Type":"ContainerDied","Data":"adc12370e74f36eead0078fc9bbf84d8451aadfed6bf8b58c758509e620da6e6"} Dec 01 10:44:24 crc kubenswrapper[4763]: I1201 10:44:24.771004 4763 scope.go:117] "RemoveContainer" containerID="adc12370e74f36eead0078fc9bbf84d8451aadfed6bf8b58c758509e620da6e6" Dec 01 10:44:25 crc kubenswrapper[4763]: I1201 10:44:25.378555 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n26h2_must-gather-wdhjr_58a2af15-6ccd-439b-9fac-039e0f6a9342/gather/0.log" Dec 01 10:44:33 crc kubenswrapper[4763]: I1201 10:44:33.995344 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:44:33 crc kubenswrapper[4763]: E1201 10:44:33.996607 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:44:39 crc kubenswrapper[4763]: I1201 10:44:39.628746 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n26h2/must-gather-wdhjr"] Dec 01 10:44:39 crc kubenswrapper[4763]: I1201 10:44:39.629633 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n26h2/must-gather-wdhjr" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342" containerName="copy" containerID="cri-o://afb94e44f36ae2c089a040cf16a2e3e107aadde89d750f57ac881d77c0af6033" gracePeriod=2 Dec 01 10:44:39 crc kubenswrapper[4763]: I1201 10:44:39.655623 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n26h2/must-gather-wdhjr"] Dec 01 10:44:39 crc kubenswrapper[4763]: I1201 10:44:39.929545 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n26h2_must-gather-wdhjr_58a2af15-6ccd-439b-9fac-039e0f6a9342/copy/0.log" Dec 01 10:44:39 crc kubenswrapper[4763]: I1201 10:44:39.930605 4763 generic.go:334] "Generic (PLEG): container finished" podID="58a2af15-6ccd-439b-9fac-039e0f6a9342" containerID="afb94e44f36ae2c089a040cf16a2e3e107aadde89d750f57ac881d77c0af6033" exitCode=143 Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.104102 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n26h2_must-gather-wdhjr_58a2af15-6ccd-439b-9fac-039e0f6a9342/copy/0.log" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.104487 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.290537 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58a2af15-6ccd-439b-9fac-039e0f6a9342-must-gather-output\") pod \"58a2af15-6ccd-439b-9fac-039e0f6a9342\" (UID: \"58a2af15-6ccd-439b-9fac-039e0f6a9342\") " Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.290868 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4bjd\" (UniqueName: \"kubernetes.io/projected/58a2af15-6ccd-439b-9fac-039e0f6a9342-kube-api-access-v4bjd\") pod \"58a2af15-6ccd-439b-9fac-039e0f6a9342\" (UID: \"58a2af15-6ccd-439b-9fac-039e0f6a9342\") " Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.296666 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a2af15-6ccd-439b-9fac-039e0f6a9342-kube-api-access-v4bjd" (OuterVolumeSpecName: "kube-api-access-v4bjd") pod "58a2af15-6ccd-439b-9fac-039e0f6a9342" (UID: "58a2af15-6ccd-439b-9fac-039e0f6a9342"). InnerVolumeSpecName "kube-api-access-v4bjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.394374 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4bjd\" (UniqueName: \"kubernetes.io/projected/58a2af15-6ccd-439b-9fac-039e0f6a9342-kube-api-access-v4bjd\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.498175 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58a2af15-6ccd-439b-9fac-039e0f6a9342-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "58a2af15-6ccd-439b-9fac-039e0f6a9342" (UID: "58a2af15-6ccd-439b-9fac-039e0f6a9342"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.598223 4763 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58a2af15-6ccd-439b-9fac-039e0f6a9342-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.946966 4763 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n26h2_must-gather-wdhjr_58a2af15-6ccd-439b-9fac-039e0f6a9342/copy/0.log" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.947443 4763 scope.go:117] "RemoveContainer" containerID="afb94e44f36ae2c089a040cf16a2e3e107aadde89d750f57ac881d77c0af6033" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.947595 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n26h2/must-gather-wdhjr" Dec 01 10:44:40 crc kubenswrapper[4763]: I1201 10:44:40.984047 4763 scope.go:117] "RemoveContainer" containerID="adc12370e74f36eead0078fc9bbf84d8451aadfed6bf8b58c758509e620da6e6" Dec 01 10:44:41 crc kubenswrapper[4763]: I1201 10:44:41.021652 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342" path="/var/lib/kubelet/pods/58a2af15-6ccd-439b-9fac-039e0f6a9342/volumes" Dec 01 10:44:45 crc kubenswrapper[4763]: I1201 10:44:45.993855 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:44:45 crc kubenswrapper[4763]: E1201 10:44:45.994536 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:44:57 crc kubenswrapper[4763]: I1201 10:44:57.995094 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:44:57 crc kubenswrapper[4763]: E1201 10:44:57.995833 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.197089 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p"] Dec 01 10:45:00 crc kubenswrapper[4763]: E1201 10:45:00.199150 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8640634b-a87c-401b-aac0-11e53eb82be4" containerName="container-00" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.199179 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="8640634b-a87c-401b-aac0-11e53eb82be4" containerName="container-00" Dec 01 10:45:00 crc kubenswrapper[4763]: E1201 10:45:00.199214 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342" containerName="gather" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.199224 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342" containerName="gather" Dec 01 10:45:00 crc kubenswrapper[4763]: E1201 10:45:00.199242 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342" containerName="copy" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.199248 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342" containerName="copy" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.199622 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342" containerName="gather" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.199654 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342" 
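
Every kubenswrapper line in this journal carries the klog prefix <severity><MMDD> <HH:MM:SS.micros> <pid> <source-file:line>] ahead of the structured message. A sketch that pulls those fields out of one (abridged) entry copied from above:

package main

import (
	"fmt"
	"regexp"
)

func main() {
	// I/W/E/F severity, MMDD date, wall-clock time, PID, then file:line].
	re := regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+:\d+)\] (.*)$`)
	line := `I1201 10:44:41.021652 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a2af15-6ccd-439b-9fac-039e0f6a9342"`
	m := re.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmessage=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}
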
containerName="copy" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.199676 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="8640634b-a87c-401b-aac0-11e53eb82be4" containerName="container-00" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.200759 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.202803 4763 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.203483 4763 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.208204 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p"] Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.343515 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6a45c87-4dcb-4156-b4ad-38df706334fe-config-volume\") pod \"collect-profiles-29409765-t585p\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.343617 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfzb\" (UniqueName: \"kubernetes.io/projected/e6a45c87-4dcb-4156-b4ad-38df706334fe-kube-api-access-2xfzb\") pod \"collect-profiles-29409765-t585p\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.343654 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6a45c87-4dcb-4156-b4ad-38df706334fe-secret-volume\") pod \"collect-profiles-29409765-t585p\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.445279 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6a45c87-4dcb-4156-b4ad-38df706334fe-config-volume\") pod \"collect-profiles-29409765-t585p\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.445393 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xfzb\" (UniqueName: \"kubernetes.io/projected/e6a45c87-4dcb-4156-b4ad-38df706334fe-kube-api-access-2xfzb\") pod \"collect-profiles-29409765-t585p\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.445428 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6a45c87-4dcb-4156-b4ad-38df706334fe-secret-volume\") pod \"collect-profiles-29409765-t585p\" (UID: 
\"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.446547 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6a45c87-4dcb-4156-b4ad-38df706334fe-config-volume\") pod \"collect-profiles-29409765-t585p\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.459321 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6a45c87-4dcb-4156-b4ad-38df706334fe-secret-volume\") pod \"collect-profiles-29409765-t585p\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.470396 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfzb\" (UniqueName: \"kubernetes.io/projected/e6a45c87-4dcb-4156-b4ad-38df706334fe-kube-api-access-2xfzb\") pod \"collect-profiles-29409765-t585p\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:00 crc kubenswrapper[4763]: I1201 10:45:00.529910 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:01 crc kubenswrapper[4763]: I1201 10:45:01.063990 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p"] Dec 01 10:45:01 crc kubenswrapper[4763]: I1201 10:45:01.180075 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" event={"ID":"e6a45c87-4dcb-4156-b4ad-38df706334fe","Type":"ContainerStarted","Data":"333651fae3c74e47d0b5ed176135c244fc5041e751aec9bece354c4a5cf95d1c"} Dec 01 10:45:02 crc kubenswrapper[4763]: I1201 10:45:02.194921 4763 generic.go:334] "Generic (PLEG): container finished" podID="e6a45c87-4dcb-4156-b4ad-38df706334fe" containerID="e7d19bdfcf38de52571e6fff04f15a64544c64009cce6706ee9855651df7ab2c" exitCode=0 Dec 01 10:45:02 crc kubenswrapper[4763]: I1201 10:45:02.195021 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" event={"ID":"e6a45c87-4dcb-4156-b4ad-38df706334fe","Type":"ContainerDied","Data":"e7d19bdfcf38de52571e6fff04f15a64544c64009cce6706ee9855651df7ab2c"} Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.569774 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.724559 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6a45c87-4dcb-4156-b4ad-38df706334fe-config-volume\") pod \"e6a45c87-4dcb-4156-b4ad-38df706334fe\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.724602 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6a45c87-4dcb-4156-b4ad-38df706334fe-secret-volume\") pod \"e6a45c87-4dcb-4156-b4ad-38df706334fe\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.724633 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xfzb\" (UniqueName: \"kubernetes.io/projected/e6a45c87-4dcb-4156-b4ad-38df706334fe-kube-api-access-2xfzb\") pod \"e6a45c87-4dcb-4156-b4ad-38df706334fe\" (UID: \"e6a45c87-4dcb-4156-b4ad-38df706334fe\") " Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.725126 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6a45c87-4dcb-4156-b4ad-38df706334fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "e6a45c87-4dcb-4156-b4ad-38df706334fe" (UID: "e6a45c87-4dcb-4156-b4ad-38df706334fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.730413 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a45c87-4dcb-4156-b4ad-38df706334fe-kube-api-access-2xfzb" (OuterVolumeSpecName: "kube-api-access-2xfzb") pod "e6a45c87-4dcb-4156-b4ad-38df706334fe" (UID: "e6a45c87-4dcb-4156-b4ad-38df706334fe"). InnerVolumeSpecName "kube-api-access-2xfzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.732583 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a45c87-4dcb-4156-b4ad-38df706334fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e6a45c87-4dcb-4156-b4ad-38df706334fe" (UID: "e6a45c87-4dcb-4156-b4ad-38df706334fe"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.827340 4763 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6a45c87-4dcb-4156-b4ad-38df706334fe-config-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.827378 4763 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6a45c87-4dcb-4156-b4ad-38df706334fe-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:03 crc kubenswrapper[4763]: I1201 10:45:03.827388 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xfzb\" (UniqueName: \"kubernetes.io/projected/e6a45c87-4dcb-4156-b4ad-38df706334fe-kube-api-access-2xfzb\") on node \"crc\" DevicePath \"\"" Dec 01 10:45:04 crc kubenswrapper[4763]: I1201 10:45:04.218996 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" event={"ID":"e6a45c87-4dcb-4156-b4ad-38df706334fe","Type":"ContainerDied","Data":"333651fae3c74e47d0b5ed176135c244fc5041e751aec9bece354c4a5cf95d1c"} Dec 01 10:45:04 crc kubenswrapper[4763]: I1201 10:45:04.219754 4763 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="333651fae3c74e47d0b5ed176135c244fc5041e751aec9bece354c4a5cf95d1c" Dec 01 10:45:04 crc kubenswrapper[4763]: I1201 10:45:04.219259 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29409765-t585p" Dec 01 10:45:04 crc kubenswrapper[4763]: I1201 10:45:04.698491 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6"] Dec 01 10:45:04 crc kubenswrapper[4763]: I1201 10:45:04.708435 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29409720-qgdm6"] Dec 01 10:45:05 crc kubenswrapper[4763]: I1201 10:45:05.007684 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304258b8-4b3c-4ec7-833e-ecd8c317f947" path="/var/lib/kubelet/pods/304258b8-4b3c-4ec7-833e-ecd8c317f947/volumes" Dec 01 10:45:10 crc kubenswrapper[4763]: I1201 10:45:10.891636 4763 scope.go:117] "RemoveContainer" containerID="a247fcd89d16c5676fc6f851d345fc41d3cb0f5447d4d6980636fae1d7ee910e" Dec 01 10:45:13 crc kubenswrapper[4763]: I1201 10:45:13.018678 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:45:13 crc kubenswrapper[4763]: E1201 10:45:13.026176 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:45:24 crc kubenswrapper[4763]: I1201 10:45:24.994563 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:45:24 crc kubenswrapper[4763]: E1201 10:45:24.995587 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:45:38 crc kubenswrapper[4763]: I1201 10:45:38.996943 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:45:38 crc kubenswrapper[4763]: E1201 10:45:38.997870 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.362327 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bn8b7"] Dec 01 10:45:43 crc kubenswrapper[4763]: E1201 10:45:43.363560 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a45c87-4dcb-4156-b4ad-38df706334fe" containerName="collect-profiles" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.363582 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a45c87-4dcb-4156-b4ad-38df706334fe" containerName="collect-profiles" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.363970 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a45c87-4dcb-4156-b4ad-38df706334fe" containerName="collect-profiles" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.366962 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.384955 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn8b7"] Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.549176 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-utilities\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.549564 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4gf\" (UniqueName: \"kubernetes.io/projected/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-kube-api-access-9t4gf\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.549760 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-catalog-content\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.651381 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4gf\" (UniqueName: \"kubernetes.io/projected/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-kube-api-access-9t4gf\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.651504 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-catalog-content\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.651540 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-utilities\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.652003 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-catalog-content\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.652038 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-utilities\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.677429 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9t4gf\" (UniqueName: \"kubernetes.io/projected/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-kube-api-access-9t4gf\") pod \"redhat-operators-bn8b7\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:43 crc kubenswrapper[4763]: I1201 10:45:43.698488 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:44 crc kubenswrapper[4763]: I1201 10:45:44.196877 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bn8b7"] Dec 01 10:45:44 crc kubenswrapper[4763]: I1201 10:45:44.618368 4763 generic.go:334] "Generic (PLEG): container finished" podID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerID="d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17" exitCode=0 Dec 01 10:45:44 crc kubenswrapper[4763]: I1201 10:45:44.618672 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8b7" event={"ID":"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99","Type":"ContainerDied","Data":"d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17"} Dec 01 10:45:44 crc kubenswrapper[4763]: I1201 10:45:44.618700 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8b7" event={"ID":"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99","Type":"ContainerStarted","Data":"ae5ae4b59801622b623676c1a55fe1533331ba88f3823f38463fb71fefbfeb7e"} Dec 01 10:45:44 crc kubenswrapper[4763]: I1201 10:45:44.621061 4763 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 01 10:45:45 crc kubenswrapper[4763]: I1201 10:45:45.633314 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8b7" event={"ID":"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99","Type":"ContainerStarted","Data":"eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114"} Dec 01 10:45:48 crc kubenswrapper[4763]: I1201 10:45:48.665434 4763 generic.go:334] "Generic (PLEG): container finished" podID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerID="eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114" exitCode=0 Dec 01 10:45:48 crc kubenswrapper[4763]: I1201 10:45:48.665555 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8b7" event={"ID":"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99","Type":"ContainerDied","Data":"eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114"} Dec 01 10:45:49 crc kubenswrapper[4763]: I1201 10:45:49.680743 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8b7" event={"ID":"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99","Type":"ContainerStarted","Data":"429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8"} Dec 01 10:45:49 crc kubenswrapper[4763]: I1201 10:45:49.722567 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bn8b7" podStartSLOduration=2.2276802079999998 podStartE2EDuration="6.722538479s" podCreationTimestamp="2025-12-01 10:45:43 +0000 UTC" firstStartedPulling="2025-12-01 10:45:44.620882513 +0000 UTC m=+5461.889531281" lastFinishedPulling="2025-12-01 10:45:49.115740784 +0000 UTC m=+5466.384389552" observedRunningTime="2025-12-01 10:45:49.704201081 +0000 UTC m=+5466.972849849" watchObservedRunningTime="2025-12-01 10:45:49.722538479 +0000 UTC m=+5466.991187287" Dec 01 10:45:51 crc 
kubenswrapper[4763]: I1201 10:45:51.995425 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:45:51 crc kubenswrapper[4763]: E1201 10:45:51.996066 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:45:53 crc kubenswrapper[4763]: I1201 10:45:53.699131 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:53 crc kubenswrapper[4763]: I1201 10:45:53.700383 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:45:54 crc kubenswrapper[4763]: I1201 10:45:54.754131 4763 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bn8b7" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="registry-server" probeResult="failure" output=< Dec 01 10:45:54 crc kubenswrapper[4763]: timeout: failed to connect service ":50051" within 1s Dec 01 10:45:54 crc kubenswrapper[4763]: > Dec 01 10:46:03 crc kubenswrapper[4763]: I1201 10:46:03.775443 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:46:03 crc kubenswrapper[4763]: I1201 10:46:03.856082 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:46:03 crc kubenswrapper[4763]: I1201 10:46:03.994596 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:46:03 crc kubenswrapper[4763]: E1201 10:46:03.994948 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:46:04 crc kubenswrapper[4763]: I1201 10:46:04.013395 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bn8b7"] Dec 01 10:46:04 crc kubenswrapper[4763]: I1201 10:46:04.843721 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bn8b7" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="registry-server" containerID="cri-o://429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8" gracePeriod=2 Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.305941 4763 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.451276 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-catalog-content\") pod \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.451351 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t4gf\" (UniqueName: \"kubernetes.io/projected/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-kube-api-access-9t4gf\") pod \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.451398 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-utilities\") pod \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\" (UID: \"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99\") " Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.453408 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-utilities" (OuterVolumeSpecName: "utilities") pod "d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" (UID: "d2ae1cbd-0348-407f-b29c-5f9f3fae0d99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.466493 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-kube-api-access-9t4gf" (OuterVolumeSpecName: "kube-api-access-9t4gf") pod "d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" (UID: "d2ae1cbd-0348-407f-b29c-5f9f3fae0d99"). InnerVolumeSpecName "kube-api-access-9t4gf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.555483 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t4gf\" (UniqueName: \"kubernetes.io/projected/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-kube-api-access-9t4gf\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.555743 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.570118 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" (UID: "d2ae1cbd-0348-407f-b29c-5f9f3fae0d99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.657868 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.858719 4763 generic.go:334] "Generic (PLEG): container finished" podID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerID="429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8" exitCode=0 Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.858823 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8b7" event={"ID":"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99","Type":"ContainerDied","Data":"429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8"} Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.858903 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bn8b7" event={"ID":"d2ae1cbd-0348-407f-b29c-5f9f3fae0d99","Type":"ContainerDied","Data":"ae5ae4b59801622b623676c1a55fe1533331ba88f3823f38463fb71fefbfeb7e"} Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.858938 4763 scope.go:117] "RemoveContainer" containerID="429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.858981 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bn8b7" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.904047 4763 scope.go:117] "RemoveContainer" containerID="eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.922397 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bn8b7"] Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.929215 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bn8b7"] Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.936088 4763 scope.go:117] "RemoveContainer" containerID="d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.973940 4763 scope.go:117] "RemoveContainer" containerID="429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8" Dec 01 10:46:05 crc kubenswrapper[4763]: E1201 10:46:05.974431 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8\": container with ID starting with 429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8 not found: ID does not exist" containerID="429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.974477 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8"} err="failed to get container status \"429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8\": rpc error: code = NotFound desc = could not find container \"429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8\": container with ID starting with 429fb28f5be7d5778a3bb0d875b8eefe251a5afa79f797eefe73e569890e9dd8 not found: ID does not exist" Dec 01 10:46:05 crc 
kubenswrapper[4763]: I1201 10:46:05.974498 4763 scope.go:117] "RemoveContainer" containerID="eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114" Dec 01 10:46:05 crc kubenswrapper[4763]: E1201 10:46:05.974771 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114\": container with ID starting with eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114 not found: ID does not exist" containerID="eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.974793 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114"} err="failed to get container status \"eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114\": rpc error: code = NotFound desc = could not find container \"eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114\": container with ID starting with eec00022ab83e31ad3e408fa744a378ff1f78325a676ae2b3d8d91176c59e114 not found: ID does not exist" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.974805 4763 scope.go:117] "RemoveContainer" containerID="d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17" Dec 01 10:46:05 crc kubenswrapper[4763]: E1201 10:46:05.975029 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17\": container with ID starting with d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17 not found: ID does not exist" containerID="d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17" Dec 01 10:46:05 crc kubenswrapper[4763]: I1201 10:46:05.975050 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17"} err="failed to get container status \"d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17\": rpc error: code = NotFound desc = could not find container \"d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17\": container with ID starting with d0284cb30146557c746b80ff7bede69cf9b269d68d0ece3dea38c3ef2160be17 not found: ID does not exist" Dec 01 10:46:07 crc kubenswrapper[4763]: I1201 10:46:07.009841 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" path="/var/lib/kubelet/pods/d2ae1cbd-0348-407f-b29c-5f9f3fae0d99/volumes" Dec 01 10:46:18 crc kubenswrapper[4763]: I1201 10:46:18.874264 4763 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kthz8"] Dec 01 10:46:18 crc kubenswrapper[4763]: E1201 10:46:18.875560 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="registry-server" Dec 01 10:46:18 crc kubenswrapper[4763]: I1201 10:46:18.875585 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="registry-server" Dec 01 10:46:18 crc kubenswrapper[4763]: E1201 10:46:18.875610 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="extract-utilities" Dec 01 10:46:18 crc kubenswrapper[4763]: I1201 10:46:18.875619 4763 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="extract-utilities" Dec 01 10:46:18 crc kubenswrapper[4763]: E1201 10:46:18.875636 4763 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="extract-content" Dec 01 10:46:18 crc kubenswrapper[4763]: I1201 10:46:18.875644 4763 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="extract-content" Dec 01 10:46:18 crc kubenswrapper[4763]: I1201 10:46:18.875925 4763 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ae1cbd-0348-407f-b29c-5f9f3fae0d99" containerName="registry-server" Dec 01 10:46:18 crc kubenswrapper[4763]: I1201 10:46:18.878294 4763 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:18 crc kubenswrapper[4763]: I1201 10:46:18.910419 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kthz8"] Dec 01 10:46:18 crc kubenswrapper[4763]: I1201 10:46:18.994537 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:46:18 crc kubenswrapper[4763]: E1201 10:46:18.994778 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.006787 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj2p4\" (UniqueName: \"kubernetes.io/projected/00d955f4-c511-4979-af85-7e42dcecc7e7-kube-api-access-xj2p4\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.007159 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-catalog-content\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.007222 4763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-utilities\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.108633 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj2p4\" (UniqueName: \"kubernetes.io/projected/00d955f4-c511-4979-af85-7e42dcecc7e7-kube-api-access-xj2p4\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.108696 4763 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-catalog-content\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.108763 4763 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-utilities\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.109659 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-utilities\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.109685 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-catalog-content\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.134284 4763 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj2p4\" (UniqueName: \"kubernetes.io/projected/00d955f4-c511-4979-af85-7e42dcecc7e7-kube-api-access-xj2p4\") pod \"community-operators-kthz8\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.208815 4763 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:19 crc kubenswrapper[4763]: I1201 10:46:19.750323 4763 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kthz8"] Dec 01 10:46:20 crc kubenswrapper[4763]: I1201 10:46:20.126996 4763 generic.go:334] "Generic (PLEG): container finished" podID="00d955f4-c511-4979-af85-7e42dcecc7e7" containerID="897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39" exitCode=0 Dec 01 10:46:20 crc kubenswrapper[4763]: I1201 10:46:20.127029 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthz8" event={"ID":"00d955f4-c511-4979-af85-7e42dcecc7e7","Type":"ContainerDied","Data":"897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39"} Dec 01 10:46:20 crc kubenswrapper[4763]: I1201 10:46:20.127293 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthz8" event={"ID":"00d955f4-c511-4979-af85-7e42dcecc7e7","Type":"ContainerStarted","Data":"6d54d3827ec6c5f25b6505b9a644eda6c6bda9f2119820db63ed2c2e6b8b87b2"} Dec 01 10:46:21 crc kubenswrapper[4763]: I1201 10:46:21.144306 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthz8" event={"ID":"00d955f4-c511-4979-af85-7e42dcecc7e7","Type":"ContainerStarted","Data":"1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590"} Dec 01 10:46:22 crc kubenswrapper[4763]: I1201 10:46:22.156914 4763 generic.go:334] "Generic (PLEG): container finished" podID="00d955f4-c511-4979-af85-7e42dcecc7e7" containerID="1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590" exitCode=0 Dec 01 10:46:22 crc kubenswrapper[4763]: I1201 10:46:22.156982 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthz8" event={"ID":"00d955f4-c511-4979-af85-7e42dcecc7e7","Type":"ContainerDied","Data":"1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590"} Dec 01 10:46:23 crc kubenswrapper[4763]: I1201 10:46:23.168490 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthz8" event={"ID":"00d955f4-c511-4979-af85-7e42dcecc7e7","Type":"ContainerStarted","Data":"65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78"} Dec 01 10:46:23 crc kubenswrapper[4763]: I1201 10:46:23.190575 4763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kthz8" podStartSLOduration=2.53270901 podStartE2EDuration="5.190556722s" podCreationTimestamp="2025-12-01 10:46:18 +0000 UTC" firstStartedPulling="2025-12-01 10:46:20.128314117 +0000 UTC m=+5497.396962875" lastFinishedPulling="2025-12-01 10:46:22.786161779 +0000 UTC m=+5500.054810587" observedRunningTime="2025-12-01 10:46:23.181531507 +0000 UTC m=+5500.450180275" watchObservedRunningTime="2025-12-01 10:46:23.190556722 +0000 UTC m=+5500.459205490" Dec 01 10:46:29 crc kubenswrapper[4763]: I1201 10:46:29.209615 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:29 crc kubenswrapper[4763]: I1201 10:46:29.210366 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:29 crc kubenswrapper[4763]: I1201 10:46:29.288207 4763 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:29 crc kubenswrapper[4763]: I1201 10:46:29.373064 4763 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:29 crc kubenswrapper[4763]: I1201 10:46:29.531777 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kthz8"] Dec 01 10:46:31 crc kubenswrapper[4763]: I1201 10:46:31.261175 4763 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kthz8" podUID="00d955f4-c511-4979-af85-7e42dcecc7e7" containerName="registry-server" containerID="cri-o://65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78" gracePeriod=2 Dec 01 10:46:31 crc kubenswrapper[4763]: I1201 10:46:31.964976 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.012328 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj2p4\" (UniqueName: \"kubernetes.io/projected/00d955f4-c511-4979-af85-7e42dcecc7e7-kube-api-access-xj2p4\") pod \"00d955f4-c511-4979-af85-7e42dcecc7e7\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.012837 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-utilities\") pod \"00d955f4-c511-4979-af85-7e42dcecc7e7\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.013027 4763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-catalog-content\") pod \"00d955f4-c511-4979-af85-7e42dcecc7e7\" (UID: \"00d955f4-c511-4979-af85-7e42dcecc7e7\") " Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.019359 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d955f4-c511-4979-af85-7e42dcecc7e7-kube-api-access-xj2p4" (OuterVolumeSpecName: "kube-api-access-xj2p4") pod "00d955f4-c511-4979-af85-7e42dcecc7e7" (UID: "00d955f4-c511-4979-af85-7e42dcecc7e7"). InnerVolumeSpecName "kube-api-access-xj2p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.025152 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-utilities" (OuterVolumeSpecName: "utilities") pod "00d955f4-c511-4979-af85-7e42dcecc7e7" (UID: "00d955f4-c511-4979-af85-7e42dcecc7e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.074528 4763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00d955f4-c511-4979-af85-7e42dcecc7e7" (UID: "00d955f4-c511-4979-af85-7e42dcecc7e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.117799 4763 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.117870 4763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj2p4\" (UniqueName: \"kubernetes.io/projected/00d955f4-c511-4979-af85-7e42dcecc7e7-kube-api-access-xj2p4\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.117892 4763 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00d955f4-c511-4979-af85-7e42dcecc7e7-utilities\") on node \"crc\" DevicePath \"\"" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.272664 4763 generic.go:334] "Generic (PLEG): container finished" podID="00d955f4-c511-4979-af85-7e42dcecc7e7" containerID="65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78" exitCode=0 Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.272754 4763 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kthz8" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.272747 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthz8" event={"ID":"00d955f4-c511-4979-af85-7e42dcecc7e7","Type":"ContainerDied","Data":"65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78"} Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.273710 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kthz8" event={"ID":"00d955f4-c511-4979-af85-7e42dcecc7e7","Type":"ContainerDied","Data":"6d54d3827ec6c5f25b6505b9a644eda6c6bda9f2119820db63ed2c2e6b8b87b2"} Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.273760 4763 scope.go:117] "RemoveContainer" containerID="65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.303189 4763 scope.go:117] "RemoveContainer" containerID="1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.340674 4763 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kthz8"] Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.340778 4763 scope.go:117] "RemoveContainer" containerID="897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.355515 4763 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kthz8"] Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.393962 4763 scope.go:117] "RemoveContainer" containerID="65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78" Dec 01 10:46:32 crc kubenswrapper[4763]: E1201 10:46:32.394597 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78\": container with ID starting with 65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78 not found: ID does not exist" containerID="65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.394667 
4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78"} err="failed to get container status \"65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78\": rpc error: code = NotFound desc = could not find container \"65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78\": container with ID starting with 65de1d12e1a44cf7a95b753111248f4490e44ce46e207c4b5eea49590408ea78 not found: ID does not exist" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.394706 4763 scope.go:117] "RemoveContainer" containerID="1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590" Dec 01 10:46:32 crc kubenswrapper[4763]: E1201 10:46:32.395302 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590\": container with ID starting with 1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590 not found: ID does not exist" containerID="1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.395353 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590"} err="failed to get container status \"1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590\": rpc error: code = NotFound desc = could not find container \"1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590\": container with ID starting with 1e09b092a2803951554df415b7411e5e71a0e9e8a2c651f7a6839d8879170590 not found: ID does not exist" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.395383 4763 scope.go:117] "RemoveContainer" containerID="897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39" Dec 01 10:46:32 crc kubenswrapper[4763]: E1201 10:46:32.395938 4763 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39\": container with ID starting with 897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39 not found: ID does not exist" containerID="897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39" Dec 01 10:46:32 crc kubenswrapper[4763]: I1201 10:46:32.395984 4763 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39"} err="failed to get container status \"897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39\": rpc error: code = NotFound desc = could not find container \"897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39\": container with ID starting with 897ee1d4a95ebbda9bb266d8ad98cce70223f1cff1d35bc65c17fbd8caf83c39 not found: ID does not exist" Dec 01 10:46:33 crc kubenswrapper[4763]: I1201 10:46:33.009646 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:46:33 crc kubenswrapper[4763]: E1201 10:46:33.010157 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:46:33 crc kubenswrapper[4763]: I1201 10:46:33.015003 4763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d955f4-c511-4979-af85-7e42dcecc7e7" path="/var/lib/kubelet/pods/00d955f4-c511-4979-af85-7e42dcecc7e7/volumes" Dec 01 10:46:45 crc kubenswrapper[4763]: I1201 10:46:45.994952 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:46:45 crc kubenswrapper[4763]: E1201 10:46:45.995750 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:47:00 crc kubenswrapper[4763]: I1201 10:47:00.994569 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:47:00 crc kubenswrapper[4763]: E1201 10:47:00.995322 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:47:15 crc kubenswrapper[4763]: I1201 10:47:15.993758 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:47:15 crc kubenswrapper[4763]: E1201 10:47:15.994482 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:47:27 crc kubenswrapper[4763]: I1201 10:47:27.994100 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:47:27 crc kubenswrapper[4763]: E1201 10:47:27.996005 4763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l5kgb_openshift-machine-config-operator(f95ef452-7057-4afb-a8ca-1c505b953c2e)\"" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" podUID="f95ef452-7057-4afb-a8ca-1c505b953c2e" Dec 01 10:47:43 crc kubenswrapper[4763]: I1201 10:47:43.025065 4763 scope.go:117] "RemoveContainer" containerID="9690c003f2b241cf6936c4d0186fc7c352df5f7c85a96210d5f372f4228ac29f" Dec 01 10:47:44 crc kubenswrapper[4763]: I1201 10:47:44.059913 4763 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l5kgb" 
event={"ID":"f95ef452-7057-4afb-a8ca-1c505b953c2e","Type":"ContainerStarted","Data":"497546eb8675ccc5a6e35e0e728d97997ad87c5512da0909198338eeec123d98"}